[UE5] Offline AI Chat - Integrating the LLaMA Language Model (Tutorial)
Preface: LLaMA (Large Language Model Meta AI) is a family of transformer-based large language models released by Meta AI. It is built for natural-language processing tasks, is particularly good at text generation and question answering, and achieves strong results across many benchmarks. Community chat models derived from this class of open models can be run fully offline on local hardware through llama.cpp, which is what this tutorial relies on.
This article uses openchat_3.5.Q3_K_L, a quantized GGUF build of the OpenChat 3.5 chat model (an open model trained in part on ChatGPT-style conversation data), as the working example.
1. Preparation (note: links that will not open may require a proxy/VPN):
[*] Install the required software: Microsoft Visual Studio, CMake, and Git (installation is not covered in detail here; set them up yourself).
[*] Download the AI model used in this example: openchat_3.5.Q3_K_L (place it in the project directory/Content/Movies/Models/.. folder). https://i-blog.csdnimg.cn/blog_migrate/b3099b0388e41a32f2125d6888ed5048.png
[*] Download the LLAMA plugin: Llama-Unreal (later in this tutorial some of its code is modified; please support the plugin's original author, MikaPi).
[*] Create a blank C++ project, close the engine, and open the project folder: https://i-blog.csdnimg.cn/blog_migrate/24a20b7491ef6e6e8c0e89833d1b2e02.png
[*] Create a Plugins folder in the project folder and place the TTS and LLAMA plugins in it (the TTS plugin was shared in the previous article). https://i-blog.csdnimg.cn/blog_migrate/312322f5a162fe54b7e4e06d4b96362b.png
[*] Go into the LLAMA plugin folder and right-click an empty area to open Git Bash: https://i-blog.csdnimg.cn/blog_migrate/2f9efa580dc67d5b166094c7f4819b65.png
[*] mkdir llama
cd llama
[*] Place the downloaded and extracted llama.cpp inside the llama folder (use a llama.cpp version compatible with the plugin): https://i-blog.csdnimg.cn/blog_migrate/a2c9f63941ca48e6fa20079538206989.png
[*] Create a build folder and compile with CMake:
cd llama.cpp
mkdir build
cd build/
cmake .. -DBUILD_SHARED_LIBS=ON
cd ..
cmake --build build --config Release -j --verbose
[*] Build succeeded: https://i-blog.csdnimg.cn/blog_migrate/31a16b80a61008bf32dd37fbcab0146c.png
[*] 1. The file we need is llama.dll (after an MSVC Release build it is typically emitted under llama.cpp\build\bin\Release): copy it into the Plugins\UELlama\Binaries\Win64 folder. A quick way to confirm the copied DLL actually loads is sketched right after this step.
https://i-blog.csdnimg.cn/blog_migrate/9277cbb443a9eec38b7bcb7c556a90fa.png
[*] 2. The plugin's Includes and Libraries folders already contain all the other required files, so nothing else needs to be copied. https://i-blog.csdnimg.cn/blog_migrate/e3bd85369bc32f09be65b9b2b6624ba5.png
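Before wiring the DLL into Unreal, it can save time to confirm that the freshly built llama.dll is loadable at all (correct architecture, required runtime present). A minimal standalone check, assuming the DLL has been copied as described above (the path below is only an example), might look like this:

// dllcheck.cpp - standalone sanity check for a freshly built llama.dll.
// Build with: cl /EHsc dllcheck.cpp   (from an x64 Native Tools command prompt)
#include <windows.h>
#include <cstdio>

int main()
{
    // Example path only - point this at wherever you copied llama.dll.
    const char* DllPath = "Plugins\\UELlama\\Binaries\\Win64\\llama.dll";

    HMODULE Handle = LoadLibraryA(DllPath);
    if (Handle == nullptr)
    {
        // Error 193 usually indicates a 32/64-bit architecture mismatch.
        std::printf("Failed to load %s (error %lu)\n", DllPath, GetLastError());
        return 1;
    }

    std::printf("Loaded %s successfully.\n", DllPath);
    FreeLibrary(Handle);
    return 0;
}

If this tiny program loads the DLL, any later failure inside the engine is almost certainly a path or staging issue rather than a broken build.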
2. Project settings and code:
[*] Modify UELlama.Build.cs (this fixes the DLL missing from packaged builds); an optional runtime check for the staged DLL is sketched right after the listing:
using UnrealBuildTool;
using System.IO;

public class UELlama : ModuleRules
{
    public UELlama(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = ModuleRules.PCHUsageMode.UseExplicitOrSharedPCHs;

        PublicIncludePaths.AddRange(
            new string[] {
                // ... add public include paths required here ...
            }
        );

        PrivateIncludePaths.AddRange(
            new string[] {
            }
        );

        PublicDependencyModuleNames.AddRange(
            new string[]
            {
                "Core",
                // ... add other public dependencies that you statically link with here ...
            }
        );

        PrivateDependencyModuleNames.AddRange(
            new string[]
            {
                "CoreUObject",
                "Engine",
                "Slate",
                "SlateCore",
                // ... add private dependencies that you statically link with here ...
            }
        );

        if (Target.bBuildEditor)
        {
            PrivateDependencyModuleNames.AddRange(
                new string[]
                {
                    "UnrealEd"
                }
            );
        }

        if (Target.Platform == UnrealTargetPlatform.Win64)
        {
            // Register llama.dll as a runtime dependency so it is staged into packaged builds.
            string PluginBinariesDir = Path.Combine(ModuleDirectory, "..", "..", "Binaries", "Win64");   // plugin Binaries/Win64 (where llama.dll was copied earlier)
            string ProjectBinariesDir = Path.Combine(ModuleDirectory, "..", "..", "..", "..", "Binaries", "Win64"); // project Binaries/Win64
            string DLLFilePath = Path.Combine(ProjectBinariesDir, "llama.dll");
            string DestinationDLLPath = Path.Combine(PluginBinariesDir, "llama.dll");
            RuntimeDependencies.Add(DLLFilePath, DestinationDLLPath);
        }

        DynamicallyLoadedModuleNames.AddRange(
            new string[]
            {
                // ... add any modules that your module loads dynamically here ...
            }
        );

        if (Target.Platform == UnrealTargetPlatform.Linux)
        {
            PublicAdditionalLibraries.Add(Path.Combine(PluginDirectory, "Libraries", "libllama.so"));
            PublicIncludePaths.Add(Path.Combine(PluginDirectory, "Includes"));
        }
        else if (Target.Platform == UnrealTargetPlatform.Win64)
        {
            PublicAdditionalLibraries.Add(Path.Combine(PluginDirectory, "Libraries", "llama.lib"));
            PublicIncludePaths.Add(Path.Combine(PluginDirectory, "Includes"));
        }
    }
}
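If a packaged build still cannot find the DLL, a small diagnostic in the game module (or any early startup code) can confirm whether llama.dll was actually staged next to the executable. This is only an optional sketch using standard engine utilities; the log category name is made up for the example:

// Optional diagnostic: verify that llama.dll was staged into the packaged build.
// Call this from your game module's StartupModule() or other early startup code.
#include "CoreMinimal.h"
#include "Misc/Paths.h"
#include "HAL/PlatformProcess.h"

DEFINE_LOG_CATEGORY_STATIC(LogLlamaSetup, Log, All); // example-only log category

static void CheckLlamaDllStaged()
{
#if PLATFORM_WINDOWS
    // FPlatformProcess::BaseDir() is the folder containing the running executable
    // (Binaries/Win64 in both editor and packaged builds).
    const FString DllPath = FPaths::Combine(FPlatformProcess::BaseDir(), TEXT("llama.dll"));

    if (!FPaths::FileExists(DllPath))
    {
        UE_LOG(LogLlamaSetup, Warning, TEXT("llama.dll not found at %s"), *DllPath);
        return;
    }

    // Try to actually load it; a null handle usually means a missing dependency
    // or an architecture mismatch rather than a missing file.
    void* Handle = FPlatformProcess::GetDllHandle(*DllPath);
    UE_LOG(LogLlamaSetup, Log, TEXT("llama.dll load %s (%s)"),
        Handle ? TEXT("succeeded") : TEXT("failed"), *DllPath);

    if (Handle)
    {
        FPlatformProcess::FreeDllHandle(Handle);
    }
#endif
}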
[*] Generate the project files and compile; the build succeeds: https://i-blog.csdnimg.cn/blog_migrate/f2b7507cfa44c539ed78f3bc3cf50116.png
[*] Open the project:
1. Create a Blueprints folder, then create: an empty level LLAMA; a Game Mode GM_LLAMA (be careful not to create a Game Mode Base by mistake); a Player Controller PC_LLAMA; a HUD class HUD_LLAMA; and a user widget WBP_MainLLAMA.
2. In Project Settings, set the game default map to LLAMA; in World Settings, set the game mode to GM_LLAMA, the Player Controller to PC_LLAMA, and the HUD to HUD_LLAMA. https://i-blog.csdnimg.cn/blog_migrate/5d3f81b88c812411e4e9f57e6fc6610b.png
3. Set up the HUD blueprint and the user widget WBP_MainLLAMA:
(0. The HUD blueprint and its functions: https://i-blog.csdnimg.cn/blog_migrate/b84880dafdd961e8b64b86a3b9ddf982.png
https://i-blog.csdnimg.cn/blog_migrate/a1482bc85dba2c0f09db246d78c51d4d.png https://i-blog.csdnimg.cn/blog_migrate/dabd6bee4c7ab7e14ff3b46fd303b7c2.png
(1. Add the llama component;
(2. Specify the Prompt value (the starting prompt for the conversation);
(3. Specify the path to the language model, e.g.:
F:\Projects\UE_Projects\5.1\UE5LLAMA\Content\Movies\Models\openchat_3.5.Q3_K_L.gguf
(4. Specify the Stop Sequences: a new line, the value "Human:", and the value "AI:". The goal is to generate only a single line of text that corresponds to the current speaker: the completion cannot change the speaker, and it cannot let the same speaker talk twice in a row. (A C++ sketch covering the model path, the prompt template, and the stop sequences appears after substep (5.) https://i-blog.csdnimg.cn/blog_migrate/50c24d704c5d7f630ed6c3b799098dbd.png
(5. Edit the user widget WBP_MainLLAMA: https://i-blog.csdnimg.cn/blog_migrate/878ad917dbfd6e88b9c967cdd483eb01.png https://i-blog.csdnimg.cn/blog_migrate/03b134885fc1f52177a335336d3af7f1.png
Add an Add Token function (it appends each newly generated token to the displayed chat text): https://i-blog.csdnimg.cn/blog_migrate/792255bbcb2732667a1362fe1e65cd19.png
Event graph: https://i-blog.csdnimg.cn/blog_migrate/a44966f2e4b4bae37f776f6af0adb716.png
Prompt format used when sending a message (from the event graph):
User:
{prompt}
GPT4:
https://i-blog.csdnimg.cn/blog_migrate/ad95b2c0312363bbd5e447fb44b5c54c.png
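For readers who want the same flow in plain C++, here is a minimal, self-contained sketch of the three pieces configured above: resolving the model path relative to the project's Content folder (so it also works in packaged builds, rather than the absolute F:\ path shown earlier), wrapping the user's text in the User:/GPT4: template, and cutting the generated reply at the first stop sequence. The function names and the example stop sequences are illustrative only and are not the plugin's API:

#include "CoreMinimal.h"
#include "Misc/Paths.h"

// Resolve the model file relative to Content/ so packaged builds can find it.
static FString GetModelPath()
{
    return FPaths::Combine(FPaths::ProjectContentDir(),
                           TEXT("Movies/Models/openchat_3.5.Q3_K_L.gguf"));
}

// Wrap the typed message in the same template used in the event graph.
static FString BuildPrompt(const FString& UserMessage)
{
    return FString::Printf(TEXT("User:\n%s\nGPT4:\n"), *UserMessage);
}

// Cut the raw completion at the first stop sequence, so the reply stays a single
// turn for the current speaker and never continues as the other speaker.
static FString TrimAtStopSequence(const FString& Completion, const TArray<FString>& StopSequences)
{
    int32 CutIndex = Completion.Len();
    for (const FString& Stop : StopSequences)
    {
        const int32 Found = Completion.Find(Stop, ESearchCase::CaseSensitive);
        if (Found != INDEX_NONE && Found < CutIndex)
        {
            CutIndex = Found;
        }
    }
    return Completion.Left(CutIndex).TrimStartAndEnd();
}

// Example usage:
//   const FString Prompt = BuildPrompt(TEXT("Hello, who are you?"));
//   const FString Reply  = TrimAtStopSequence(RawModelOutput,
//                              { TEXT("User:"), TEXT("GPT4:"), TEXT("\n") });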
3. Compile the blueprints and run the game: the test conversation succeeds and the answer is read aloud successfully (this also works after packaging). https://i-blog.csdnimg.cn/blog_migrate/9b2bf561433daa4ffd77bda27ed628ef.png
Afterword: this project implements offline AI chat with responsive replies. A few issues remain, though: for example, when answering in Chinese some characters are rendered as "?", and different models may have different quirks, so feel free to test other language models from the same site. One common cause of the "?" characters, and a hedged sketch of a fix, follows.
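A frequent reason for "?" in Chinese output is that llama.cpp emits tokens as raw UTF-8 bytes, and a single Chinese character (3 bytes in UTF-8) can be split across two tokens; converting each token to FString on its own then produces replacement characters. Whether that is exactly what happens inside this plugin is an assumption, but if you end up modifying its token callback, buffering bytes until a complete UTF-8 sequence is available fixes the symptom. A minimal sketch (class name is made up):

#include "CoreMinimal.h"
#include <string>

// Accumulates raw UTF-8 bytes from streamed tokens and only converts the
// longest complete prefix, so multi-byte characters split across tokens
// are never turned into '?' replacement characters.
class FUtf8TokenBuffer
{
public:
    // Feed the raw bytes of one token; returns whatever text is now complete.
    FString Append(const std::string& TokenBytes)
    {
        Pending += TokenBytes;

        // Find the end of the last complete UTF-8 sequence in Pending.
        int32 CompleteLen = static_cast<int32>(Pending.size());
        int32 Index = CompleteLen - 1;
        // Walk back over trailing continuation bytes (10xxxxxx).
        while (Index >= 0 && (static_cast<uint8>(Pending[Index]) & 0xC0) == 0x80)
        {
            --Index;
        }
        if (Index >= 0)
        {
            const uint8 Lead = static_cast<uint8>(Pending[Index]);
            const int32 ExpectedLen =
                (Lead & 0x80) == 0x00 ? 1 :   // ASCII
                (Lead & 0xE0) == 0xC0 ? 2 :   // 2-byte sequence
                (Lead & 0xF0) == 0xE0 ? 3 :   // 3-byte sequence (most Chinese)
                (Lead & 0xF8) == 0xF0 ? 4 : 1;
            const int32 HaveLen = CompleteLen - Index;
            if (HaveLen < ExpectedLen)
            {
                CompleteLen = Index; // last character is still incomplete, hold it back
            }
        }

        const std::string Complete = Pending.substr(0, CompleteLen);
        Pending.erase(0, CompleteLen);
        return FString(UTF8_TO_TCHAR(Complete.c_str()));
    }

private:
    std::string Pending; // bytes of a possibly incomplete trailing character
};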
I hope this article helps you!