
# aibot

This is the configuration of aibot.

| title | body |
| --- | --- |
| aibot | ai/bot |
| server | archlinux |
| os | ai/os |
| ai | ai/ai |

```sh
$ winget install ollama.ollama
$ ollama serve
$ ollama run llama3.2
```
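
Once `ollama serve` is running, it exposes an HTTP API on `http://localhost:11434`. As a quick check, here is a minimal sketch of calling the `/api/generate` endpoint from Python (the prompt text is just an example):

```py
import json
import urllib.request

# Ollama listens on localhost:11434 by default
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3.2",
        "prompt": "hello",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as res:
    print(json.load(res)["response"])
```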

```sh
$ winget install --id Python.Python.3.11 -e
$ python --version
$ python -m venv webui
$ cd webui
$ .\Scripts\activate
$ pip install open-webui
$ open-webui serve
```

Open WebUI is now available at http://localhost:8080.

## LoRA

### finetuning

```sh
$ conda create -n finetuning python=3.11
$ conda activate finetuning
$ pip install mlx-lm # apple silicon
$ ollama run llama3.2
$ echo "{ \"model\": \"https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct\", \"data\": \"https://github.com/ml-explore/mlx-examples/tree/main/lora/data\" }" | jq .
$ model=meta-llama/Llama-3.2-3B-Instruct
$ data=ml-explore/mlx-examples/lora/data
$ mlx_lm.lora --train --model $model --data $data --batch-size 3
```
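
The `--data` directory follows the layout of `mlx-examples/lora/data`: `train.jsonl` and `valid.jsonl` files where each line is a JSON object with a `text` field. A minimal sketch for preparing your own data (the sample text and output directory are placeholders):

```py
import json
from pathlib import Path

# placeholder training samples in the {"text": ...} JSONL format
samples = [
    {"text": "Q: What is aibot? A: A local AI bot built on llama3.2."},
    {"text": "Q: Which OS does the server run? A: archlinux."},
]

data_dir = Path("data")
data_dir.mkdir(exist_ok=True)

# mlx_lm.lora expects train.jsonl and valid.jsonl inside the --data directory
for name in ("train.jsonl", "valid.jsonl"):
    with open(data_dir / name, "w") as f:
        for s in samples:
            f.write(json.dumps(s, ensure_ascii=False) + "\n")
```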

```sh
$ ls adapters
$ vim Modelfile
```

Modelfile:

```
FROM llama3.2:3b
ADAPTER ./adapters
```

```sh
$ ollama create ai -f ./Modelfile
```
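
The adapter-backed model is now registered as `ai` and can be queried through the same local API, for example via the `/api/chat` endpoint (the message is just an example):

```py
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps({
        "model": "ai",
        "messages": [{"role": "user", "content": "hello"}],
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as res:
    print(json.load(res)["message"]["content"])
```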

### unsloth

```sh
$ pip install unsloth
```

```py
from unsloth import FastLanguageModel

# Load the base model to fine-tune with GRPO.
# model_name needs a full Hugging Face repo id.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Qwen/Qwen2.5-1.5B",
    max_seq_length=2048,
    load_in_4bit=True,
)
```
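
From here, GRPO-style fine-tuning is usually done by attaching LoRA adapters with `get_peft_model` and then training with trl's `GRPOTrainer` and a reward function. A rough sketch under those assumptions; the dataset, reward function, and hyperparameters below are placeholders, not part of the original notes:

```py
from datasets import Dataset
from trl import GRPOConfig, GRPOTrainer
from unsloth import FastLanguageModel

# attach LoRA adapters to the loaded model
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# GRPO needs a dataset with a "prompt" column and at least one reward function
dataset = Dataset.from_list([{"prompt": "Explain aibot in one sentence."}])

def reward_len(completions, **kwargs):
    # toy reward: prefer completions around 100 characters
    return [-abs(len(c) - 100) for c in completions]

trainer = GRPOTrainer(
    model=model,
    reward_funcs=reward_len,
    args=GRPOConfig(output_dir="qwen2.5-grpo", max_steps=10),
    train_dataset=dataset,
)
trainer.train()
```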