
How to run Ollama "local AI" with AMDGPU / ROCm on openSUSE Tumbleweed / MicroOS / AEON Desktop

It's actually quite easy to install the amdgpu-dkms kernel driver with ROCm/AMDGPU on openSUSE Tumbleweed, or better yet on MicroOS/AEON, once you know what to do.

  • Install the long-term support kernel and its development packages:
sudo transactional-update --continue pkg install kernel-longterm kernel-longterm-devel
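
After the transaction completes, reboot into the new snapshot and verify that the long-term kernel is active and the amdgpu module is loaded (a quick sanity check; version numbers will differ on your system):

sudo reboot

uname -r              # should report the kernel-longterm version
lsmod | grep amdgpu   # the amdgpu kernel module should be listed once the driver is installed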

openSUSE MicroOS OpenGL Virtio Support for Windows VMs

It is possible to use OpenGL support over SPICE in virt-manager, e.g. for a Windows VM instance.

Just install the following dependencies on the host:

sudo transactional-update pkg install qemu-ui-spice-core libvirglrenderer1 qemu-ui-opengl
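
After rebooting into the new snapshot, switch the VM's display to virtio video with 3D acceleration and enable OpenGL for SPICE. A minimal sketch, assuming a VM named YourWindowsVM (hypothetical name):

virsh edit YourWindowsVM

In the domain XML, set `<acceleration accel3d='yes'/>` inside the virtio `<video>` model, `<gl enable='yes'/>` inside the `<graphics type='spice'>` element, and the SPICE listen type to none; virt-manager exposes the same switches on the Video and Display Spice hardware pages.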

Podman Compose with openSUSE MicroOS - Container Autostart on System Boot

It is easy to configure a user-level systemd autostart for containers on openSUSE MicroOS with Podman Compose.

While your Podman container YourContainerService is running, generate the systemd user unit:

podman generate systemd --new YourContainerService > ~/.config/systemd/user/YourContainer.service
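
If ~/.config/systemd/user does not exist yet, create it first with mkdir -p. Then reload the user manager, enable the unit, and allow your user services to linger so the container also starts at boot without an interactive login (a sketch based on the unit generated above):

systemctl --user daemon-reload
systemctl --user enable --now YourContainer.service
loginctl enable-linger $USER   # keep user services running after logout and start them at boot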