How to run Ollama ("local AI") with ROCm/AMDGPU on OpenSUSE Tumbleweed / MicroOS / AEON Desktop
It's actually quite easy to install the amdgpu-dkms kernel driver and ROCm on OpenSUSE Tumbleweed (or, even better, MicroOS/AEON) once you know what to do...
- Install the long-term support kernel plus its devel packages
sudo transactional-update --continue pkg install kernel-longterm kernel-longterm-devel
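Since transactional-update installs packages into a new snapshot, the long-term kernel only becomes active after a reboot. A minimal sketch of the follow-up step (the uname check just confirms which kernel you booted into):
sudo systemctl reboot
uname -r   # should now report the kernel-longterm version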