Open WebUI – Deploy Large Language Models Locally with a ChatGPT-Style User Interface


Introduction to Open WebUI:

Open WebUI is an open-source project that provides a graphical interface, modeled on ChatGPT, for local large language models, making it very convenient to debug and call local models. You can use it to connect to your local LLMs (including Ollama and OpenAI-compatible APIs), and it also supports remote servers. Docker deployment is straightforward, and the feature set is very rich: code highlighting, math formulas, web browsing, preset prompts, local RAG integration, conversation tagging, model downloads, chat history, voice support, and more.
Official site: https://openwebui.com
GitHub: https://github.com/open-webui/open-webui (User-friendly WebUI for LLMs, formerly Ollama WebUI)
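
Since Docker is the simplest deployment route, here is a minimal sketch of a typical single-container run. It assumes Ollama is already listening on the same host; the image tag and flags follow the project README at the time of writing, so check the repository for the current command:

# run Open WebUI in a container and persist its data in a named volume
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
# then open http://localhost:3000 in a browser

The rest of this post uses the from-source install instead, which is covered in the installation section below.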

Features:
• Intuitive interface: the chat interface is heavily inspired by ChatGPT and designed for a friendly, easy-to-use experience.
• Responsive design: a consistent, smooth experience on both desktop and mobile devices.
• Fast responses: quick and efficient performance.
• Easy startup: seamless installation with Docker or Kubernetes (via kubectl, kustomize, or helm) for a hassle-free setup (see the Docker example above).
• Code syntax highlighting: syntax highlighting makes code output clearer and easier to read.
• Full Markdown and LaTeX support: fully integrated Markdown and LaTeX enrich your LLM interactions.
• Local RAG integration: built-in Retrieval Augmented Generation (RAG) support lets you fold documents directly into the chat flow. Simply load a document into the chat or add files to the document library, then access their contents with the # command. This feature is still in alpha and is being improved continuously to ensure stable performance.
To sum up, the three key points to take away are:


  • Open WebUI is a versatile, intuitive open-source user interface that is used together with Ollama; as a web UI it gives users a private, self-hosted ChatGPT-like experience.
  • Open WebUI integrates Retrieval Augmented Generation (RAG), allowing users to supply documents, websites, and videos as context that the AI consults when answering questions, for more accurate answers.
  • The accuracy of document-based Q&A can be improved by tuning the Top K value and refining the RAG template prompt.
Q: Regarding Open WebUI's security: it asks you to register on first use, so where does the registration information go?
open-webui is an open-source library for building a web user interface; it generally does not handle data transfer itself but acts as an intermediary between the front-end framework and the back-end server.
The sign-up on first use asks you to register as the admin user. This ensures that if Open WebUI is ever exposed to external access, your data remains secure.
Note that everything is kept local. Your data is not collected. When you sign up, all information stays on your server and never leaves your device.
Your privacy and security are top priorities, and your data always remains under your control.
Reference: FAQ | Open WebUI
Q: Why am I asked to sign up? Where are my data being sent to?
A: We require you to sign up to become the admin user for enhanced security. This ensures that if the Open WebUI is ever exposed to external access, your data remains secure. It’s important to note that everything is kept local. We do not collect your data. When you sign up, all information stays within your server and never leaves your device. Your privacy and security are our top priorities, ensuring that your data remains under your control at all times.

Installing Open WebUI:

So far I have only done the installation on Linux. During the install I mainly followed this CSDN article:
Linux install reference: "ollama + open-webui: deploy your own large model locally"
Its steps are very detailed, including the errors hit during installation; basically you can follow the article step by step to resolve them.
Another article is also very detailed; if you are installing on Windows, it is worth consulting:
Windows install reference: "Deploying large language models on your own machine: Ollama and Open WebUI for AI freedom with all the major models"
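
For reference, the from-source install used here reduces to roughly the following steps. This is a sketch based on the project's manual-installation docs around that time, not an exact transcript; versions and options may have changed, so follow the README or the articles above for the current procedure:

# front end (requires a reasonably recent Node.js)
git clone https://github.com/open-webui/open-webui.git
cd open-webui
cp -RPp .env.example .env
npm i                # install front-end dependencies
npm run build        # build the static front end

# back end, inside a dedicated conda environment
conda create -n open-webui python=3.11 -y   # the snapshot below happens to use Python 3.8
conda activate open-webui
cd backend
pip install -r requirements.txt -U
bash start.sh        # serves on http://0.0.0.0:8080 by default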
Also, I am on a CentOS system and hit the following error during installation:
(open-webui) [root@master open-webui]# npm i
node: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by node)
node: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by node)
node: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by node)
node: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by node)
node: /lib64/libc.so.6: version `GLIBC_2.28' not found (required by node)
node: /lib64/libc.so.6: version `GLIBC_2.25' not found (required by node)
Fix: see the CSDN post "node: /lib64/libm.so.6: version `GLIBC_2.27' not found" (libm.so.6 / glibc 2.27 solution) for details.
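These errors mean the prebuilt Node.js binary wants a newer glibc/libstdc++ than this CentOS release provides. Besides the glibc-upgrade route in the CSDN post, one lower-risk workaround (my suggestion, not from that article, and only if the conda-forge build supports your glibc) is to install Node.js into the conda environment that is already being used, instead of touching system libraries:

# inside the conda env used for open-webui
conda activate open-webui
conda install -c conda-forge nodejs   # conda-forge builds target an older glibc baseline
node -v                               # verify it starts without the GLIBC errors
npm i                                 # retry the front-end install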
In addition, during the open-webui startup it needs to reach 'https://huggingface.co' (to download the default embedding model), and that connection failed.
I worked around it simply by setting an environment variable: export HF_ENDPOINT=https://hf-mirror.com
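To keep that setting across shells (otherwise it must be re-exported in every new session, as in the snapshot below), it can be appended to the shell profile. hf-mirror.com is a third-party mirror of huggingface.co, so substitute whichever endpoint you trust:

echo 'export HF_ENDPOINT=https://hf-mirror.com' >> ~/.bashrc
source ~/.bashrc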
Below is a snapshot of the open-webui startup after the conda virtual environment had been created and npm install had completed successfully, for reference only:
(base) [root@master backend]# conda activate open-webui
(open-webui) [root@master backend]# bash start.sh
No WEBUI_SECRET_KEY provided
Loading WEBUI_SECRET_KEY from .webui_secret_key
USER_AGENT environment variable not set, consider setting it to identify your requests.
No sentence-transformers model found with name sentence-transformers/all-MiniLM-L6-v2. Creating a new one with mean pooling.
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connection.py", line 196, in _new_conn
    sock = connection.create_connection(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 490, in _make_request
    raise new_e
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 466, in _make_request
    self._validate_conn(conn)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 1095, in _validate_conn
    conn.connect()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connection.py", line 615, in connect
    self.sock = sock = self._new_conn()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connection.py", line 211, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f8572048eb0>: Failed to establish a new connection: [Errno 101] Network is unreachable
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8572048eb0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1722, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1645, in get_hf_file_metadata
    r = _request_wrapper(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 372, in _request_wrapper
    response = _request_wrapper(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 395, in _request_wrapper
    response = get_session().request(method=method, url=url, **params)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/utils/_http.py", line 66, in send
    return super().send(request, *args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f8572048eb0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))"), '(Request ID: 430abcfa-0ffb-419d-a853-40caed43b5c8)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/utils/hub.py", line 399, in cached_file
    resolved_file = hf_hub_download(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1221, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1325, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1826, in _raise_on_head_call_error
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/root/miniconda3/envs/open-webui/bin/uvicorn", line 8, in <module>
    sys.exit(main())
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/main.py", line 410, in main
    run(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/main.py", line 577, in run
    server.run()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/root/miniconda3/envs/open-webui/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/server.py", line 69, in serve
    await self._serve(sockets)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/server.py", line 76, in _serve
    config.load()
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/config.py", line 434, in load
    self.loaded_app = import_from_string(self.app)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/uvicorn/importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/root/open-webui/backend/main.py", line 25, in <module>
    from apps.rag.main import app as rag_app
  File "/root/open-webui/backend/apps/rag/main.py", line 85, in <module>
    embedding_functions.SentenceTransformerEmbeddingFunction(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/chromadb/utils/embedding_functions.py", line 83, in __init__
    self.models[model_name] = SentenceTransformer(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/sentence_transformers/SentenceTransformer.py", line 299, in __init__
    modules = self._load_auto_model(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/sentence_transformers/SentenceTransformer.py", line 1324, in _load_auto_model
    transformer_model = Transformer(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/sentence_transformers/models/Transformer.py", line 53, in __init__
    config = AutoConfig.from_pretrained(model_name_or_path, **config_args, cache_dir=cache_dir)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
  File "/root/miniconda3/envs/open-webui/lib/python3.8/site-packages/transformers/utils/hub.py", line 442, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like sentence-transformers/all-MiniLM-L6-v2 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
(open-webui) [root@master backend]# export HF_ENDPOINT=https://hf-mirror.com
(open-webui) [root@master backend]# bash start.sh
No WEBUI_SECRET_KEY provided
Loading WEBUI_SECRET_KEY from .webui_secret_key
USER_AGENT environment variable not set, consider setting it to identify your requests.
modules.json: 100%|██████████████████████████████████████████████████████████████████████████████████████| 349/349 [00:00<00:00, 95.8kB/s]
config_sentence_transformers.json: 100%|█████████████████████████████████████████████████████████████████| 116/116 [00:00<00:00, 25.3kB/s]
README.md: 10.7kB [00:00, 19.1MB/s]
sentence_bert_config.json: 100%|███████████████████████████████████████████████████████████████████████| 53.0/53.0 [00:00<00:00, 25.2kB/s]
config.json: 612B [00:00, 283kB/s]
model.safetensors: 100%|█████████████████████████████████████████████████████████████████████████████| 90.9M/90.9M [00:16<00:00, 5.67MB/s]
tokenizer_config.json: 100%|█████████████████████████████████████████████████████████████████████████████| 350/350 [00:00<00:00, 71.3kB/s]
vocab.txt: 210kB [00:17, 30.4kB/s]Error while downloading from https://hf-mirror.com/sentence-transformers/all-MiniLM-L6-v2/resolve/main/vocab.txt: HTTPSConnectionPool(host='hf-mirror.com', port=443): Read timed out.
Trying to resume download...
vocab.txt: 232kB [00:00, 24.2MB/s]
vocab.txt: 214kB [00:29, 7.34kB/s]
tokenizer.json: 466kB [00:03, 155kB/s]
special_tokens_map.json: 100%|███████████████████████████████████████████████████████████████████████████| 112/112 [00:00<00:00, 50.8kB/s]
1_Pooling/config.json: 100%|█████████████████████████████████████████████████████████████████████████████| 190/190 [00:00<00:00, 32.2kB/s]
INFO:     Started server process [71959]
INFO:     Waiting for application startup.
Intialized router with Routing strategy: simple-shuffle
Routing fallbacks: None
Routing context window fallbacks: None
Router Redis Caching=None
#------------------------------------------------------------#
#                                                            #
#           'It would help me if you could add...'            #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#
Thank you for using LiteLLM! - Krrish & Ishaan
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
Intialized router with Routing strategy: simple-shuffle
Routing fallbacks: None
Routing context window fallbacks: None
Router Redis Caching=None
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
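
Once "Application startup complete" appears, the web UI should be reachable at http://<server-ip>:8080, and the first account registered there becomes the admin account, as noted in the FAQ section above.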