Private LLM setup: uses ollama to batch-translate text or produce summaries (it can also be used to strip ads). Currently being used to build role-play RPG training data.
from ollama import Client
import json
import re

# Connect to the Ollama server started by the compose file below
# (the service is reachable as "ollama" inside the compose network).
client = Client(
    host='http://ollama:11434',
    headers={'x-some-header': 'some-value'}
)

# Read the source text and split it into chapters on "第N章" (Chapter N) headings.
with open("text.txt", "r", encoding="utf-8") as f:
    content = f.read()

chapters = re.findall(r"(第([0-9]*)章)(.*?)(?=第[0-9]*章|\Z)", content, flags=re.DOTALL)
chapters_content = [i[2] for i in chapters]

# Prompt (in Chinese): write a story summary of the chapter and list each
# character's name, gender and current location, returned as pure JSON
# in the format below, with no extra text.
ask_content = """請把以下文章寫出故事摘要,放在 json 內,標注出角色,性別,現在位置。格式為
{
    "story_summary": "故事摘要",
    "characters": [
        {
            "name": "角色名稱",
            "gender": "角色性別",
            "current_location": "角色位置"
        },
        {
            "name": "角色名稱",
            "gender": "角色性別",
            "current_location": "角色位置"
        }
    ]
}
只要回傳純 json 給我就好,不要有其他字串。
以下為故事內容:
"""

ask_all = [ask_content + i for i in chapters_content]

# One request per chapter, with thinking disabled for the Qwen3 model.
for ask, story in zip(ask_all, chapters_content):
    response = client.chat(model='hf.co/unsloth/Qwen3-0.6B-GGUF:Q4_K_M', think=False, messages=[
        {
            'role': 'user',
            'content': ask
        },
    ])

    # Append the chapter text and the model's reply, wrapped in custom tags,
    # to the training-data file.
    with open("story_summary.txt", "a", encoding="utf-8") as f:
        f.write(f"<|ask|>{story}<|ask|>")
        f.write(f"<|gpt_start|>{response.message.content}<|gpt_end|>")

    print(response)
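The prompt asks for pure JSON, but a small model such as Qwen3-0.6B will sometimes wrap its reply in extra text, so the replies may be worth validating before they are saved as training data. A minimal sketch of such a check, assuming only well-formed replies should be kept (the parse_summary helper is mine, not part of the gist):

# Hypothetical helper (not in the original script): keep a reply only if it is
# valid JSON and contains the keys the prompt template asks for.
# Relies on the json module already imported at the top of the script.
def parse_summary(reply):
    try:
        data = json.loads(reply.strip())
    except json.JSONDecodeError:
        return None
    if "story_summary" not in data or "characters" not in data:
        return None
    return data

# Possible use inside the loop above:
#     parsed = parse_summary(response.message.content)
#     if parsed is None:
#         continue  # skip replies that are not the requested pure JSON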
version: '3.8'

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ./model:/root/.ollama
    restart: unless-stopped
    tty: true

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"
    volumes:
      - ./frontend:/app/backend/data
    depends_on:
      - ollama
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
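A possible way to wire the two files together (the script filename summarize.py is my assumption, not part of the gist): start the stack with docker compose up -d, pull the model once inside the container with docker exec ollama ollama pull hf.co/unsloth/Qwen3-0.6B-GGUF:Q4_K_M, then run python summarize.py from a container attached to the same compose network so that the host http://ollama:11434 resolves. When running the script directly on the host machine instead, the client would point at http://localhost:11434, since port 11434 is published. Open WebUI is then available at http://localhost:8080 for testing the same model interactively.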