ollama run deepseek-r1
In the interactive session, type /set think to enable thinking mode.
Thinking mode can also be set directly when running the command:
ollama run deepseek-r1 --think          # enable thinking mode
ollama run deepseek-r1 --think=false    # disable thinking mode
When running inference directly from a script, you can use --hidethinking: this runs the thinking model but returns only the final result, without the thinking process.
ollama run deepseek-r1:8b --hidethinking "Which is larger, 9.9 or 9.11?"
Both Ollama's generate API (/api/generate) and chat API (/api/chat) have been updated to support thinking.
A new think parameter has been added, which can be set to true or false to control the model's thinking process. When think is set to true, the output separates the model's thinking from its final answer. This lets users build new application experiences, such as rendering the thinking process as an animation in a graphical interface, or having an NPC in a game show a thought bubble before it speaks. When think is set to false, the model does not think and outputs the answer directly.
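The examples below use the chat endpoint; the generate endpoint accepts the same think parameter. A minimal standard-library sketch of calling it, assuming a local Ollama server on the default port 11434 and that the generate response carries the reasoning in a top-level "thinking" field alongside "response":

```python
import json
import urllib.request


def build_request(prompt, think=True, host="http://localhost:11434"):
    """Build a POST request for Ollama's /api/generate endpoint
    with the think parameter set."""
    payload = {
        "model": "deepseek-r1",
        "prompt": prompt,
        "think": think,
        "stream": False,
    }
    return urllib.request.Request(
        host + "/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(prompt, think=True):
    """Send the request and split the reply into (thinking, answer).

    With think=true, the reasoning is assumed to come back in a
    separate "thinking" field next to the final "response" text.
    """
    with urllib.request.urlopen(build_request(prompt, think)) as resp:
        body = json.load(resp)
    return body.get("thinking", ""), body.get("response", "")
```

For example, `thinking, answer = generate("What is 10 + 23?")` would print cleanly as two labelled sections, mirroring the library examples later in this post.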
Example of calling the Ollama chat API with thinking enabled:
curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-r1",
  "messages": [
    {
      "role": "user",
      "content": "how many r in the word strawberry?"
    }
  ],
  "think": true,
  "stream": false
}'

Output:
{
  "model": "deepseek-r1",
  "created_at": "2025-05-29T09:35:56.836222Z",
  "message": {
    "role": "assistant",
    "content": "The word \"strawberry\" contains **three** instances of the letter 'R'...",
    "thinking": "First, the question is: \"how many r in the word strawberry?\" I need to count the number of times the letter 'r' appears in the word \"strawberry\". Let me write down the word: ..."
  },
  "done_reason": "stop",
  "done": true,
  "total_duration": 47975065417,
  "load_duration": 29758167,
  "prompt_eval_count": 10,
  "prompt_eval_duration": 174191542,
  "eval_count": 2514,
  "eval_duration": 47770692833
}

Update to the latest Ollama Python library:
pip install ollama
from ollama import chat

messages = [
  {
    'role': 'user',
    'content': 'What is 10 + 23?',
  },
]

response = chat('deepseek-r1', messages=messages, think=True)

print('Thinking:\n========\n\n' + response.message.thinking)
print('\nResponse:\n========\n\n' + response.message.content)

Update to the latest Ollama JavaScript library:
npm i ollama
import ollama from 'ollama'

async function main() {
  const response = await ollama.chat({
    model: 'deepseek-r1',
    messages: [
      {
        role: 'user',
        content: 'What is 10 + 23',
      },
    ],
    stream: false,
    think: true,
  })

  console.log('Thinking:\n========\n\n' + response.message.thinking)
  console.log('\nResponse:\n========\n\n' + response.message.content + '\n\n')
}

main()

Example of a streaming response with thinking:
import ollama from 'ollama'

async function main() {
  const response = await ollama.chat({
    model: 'deepseek-r1',
    messages: [
      {
        role: 'user',
        content: 'What is 10 + 23',
      },
    ],
    stream: true,
    think: true,
  })

  let startedThinking = false
  let finishedThinking = false

  for await (const chunk of response) {
    if (chunk.message.thinking && !startedThinking) {
      startedThinking = true
      process.stdout.write('Thinking:\n========\n\n')
    } else if (chunk.message.content && startedThinking && !finishedThinking) {
      finishedThinking = true
      process.stdout.write('\n\nResponse:\n========\n\n')
    }

    if (chunk.message.thinking) {
      process.stdout.write(chunk.message.thinking)
    } else if (chunk.message.content) {
      process.stdout.write(chunk.message.content)
    }
  }
}

main()

Not yet supported