Build a Spring Boot project based on JDK 17.
Add the langchain4j dependency (this demo only supports Ollama); a sample Maven snippet is shown below.
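The original post does not list the dependency itself. With Maven, pulling in the core library and the Ollama integration might look like the following; the artifact coordinates are the standard langchain4j ones, and the 0.36.0 version is an assumption taken from the langchain4j-reactor dependency shown later, so adjust it to your setup:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j</artifactId>
    <version>0.36.0</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-ollama</artifactId>
    <version>0.36.0</version>
</dependency>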
Once the project is set up, start writing the code.
First, modify application.yml:
server:
  tomcat:
    uri-encoding: utf-8
  port: 8888

spring:
  # Application name
  application:
    name: studyllm
  main:
    # Allow bean definitions to be overridden
    allow-bean-definition-overriding: true
Next, create a ChatController class:
package com.example.studyllm.ollama;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.output.Response;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/ollama")
public class ChatController {

    @GetMapping("/chat")
    public String chat(String message) {
        ChatLanguageModel model = buildModel();
        // The system message pins the reply language; the user message carries the question
        Response<AiMessage> response = model.generate(
                new SystemMessage("你是一个国产大模型,请使用中文回复所有问题"),
                new UserMessage(message));
        return response.content().text();
    }

    private ChatLanguageModel buildModel() {
        return OllamaChatModel.builder()
                .baseUrl("http://127.0.0.1:11434")   // local Ollama server
                .modelName("qwen:4b")
                .temperature(0.1)
                .logRequests(true)
                .logResponses(true)
                .build();
    }
}
At this point, you can chat with the large model through the http://localhost:8888/ollama/chat endpoint.
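With the application running on port 8888, a request such as http://localhost:8888/ollama/chat?message=你好 (the message value is just an example) should return the model's reply as plain text, since Spring binds the message query parameter to the chat method's argument.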
If you want streaming output instead, adjust the project as follows.
Add this dependency to the pom:
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-reactor</artifactId>
    <version>0.36.0</version>
</dependency>
Then rewrite ChatController to stream the response:

package com.example.studyllm.ollama;

import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;
import dev.langchain4j.service.AiServices;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
@RequestMapping("/ollama")
public class ChatController {

    @GetMapping(value = "/chat", produces = "text/event-stream")
    public Flux<String> chat() {
        String message = "中國首都是哪裡";   // hard-coded question for the demo
        StreamingChatLanguageModel model = buildModel();
        // AiServices wires the streaming model and a 10-message chat memory
        // behind the Assistant interface (defined below)
        Assistant assistant = AiServices.builder(Assistant.class)
                .streamingChatLanguageModel(model)
                .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
                .build();
        return assistant.chat(message);
    }

    private StreamingChatLanguageModel buildModel() {
        return OllamaStreamingChatModel.builder()
                .baseUrl("http://127.0.0.1:11434")
                .modelName("qwen:4b")   // same model tag as the non-streaming example
                .temperature(0.1)
                .build();
    }
}
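The Assistant interface referenced by AiServices is not shown in the original post. A minimal sketch, assuming langchain4j-reactor's support for Flux<String> return types in AI Services, could look like this:

package com.example.studyllm.ollama;

import reactor.core.publisher.Flux;

// AI Service interface: with langchain4j-reactor on the classpath, AiServices
// can implement a method that returns a reactive Flux of response tokens
public interface Assistant {
    Flux<String> chat(String message);
}

The /ollama/chat endpoint then emits the tokens as a text/event-stream response, which a browser or any SSE-capable client can consume incrementally.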