Documented installation for 9070XT on WSL2 (Win10, Ubuntu-22.04)

This guide assumes you already know how to set up WSL2.
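
If WSL2 is not set up yet, a minimal sketch from a PowerShell prompt on Windows (run as administrator for the initial install; distro name assumed) looks like this:

wsl --install -d Ubuntu-22.04
wsl --set-default-version 2
wsl -l -v # confirm the distro is running under WSL version 2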

Support for VPN (only necessary if you use a VPN)

sudo rm /etc/resolv.conf
sudo bash -c 'echo "nameserver 8.8.8.8" > /etc/resolv.conf'
sudo bash -c 'echo "[network]" > /etc/wsl.conf'
sudo bash -c 'echo "generateResolvConf = false" >> /etc/wsl.conf'
sudo chattr +i /etc/resolv.conf
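
Restart the distro for the wsl.conf change to take effect (run wsl --shutdown from Windows, then reopen the terminal) and verify that DNS resolution works:

cat /etc/resolv.conf # should only contain "nameserver 8.8.8.8"
getent hosts repo.radeon.com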

Update system

sudo apt update
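
Optionally also upgrade already-installed packages:

sudo apt upgrade -y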

Install ROCm (Ubuntu-22.04)

https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/wsl/install-radeon.html

cd ~
wget https://repo.radeon.com/amdgpu-install/6.4.1/ubuntu/jammy/amdgpu-install_6.4.60401-1_all.deb
sudo apt install ./amdgpu-install_6.4.60401-1_all.deb

Check that the installation was successful

sudo amdgpu-install --list-usecase

Install ROCm and WSL2 support

amdgpu-install -y --usecase=wsl,rocm --no-dkms

Check again that the installation was successful

rocminfo | grep 'Marketing Name'
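
If rocminfo does not list the card, a quick sanity check is to confirm that WSL exposes the GPU paravirtualization device at all:

ls -l /dev/dxg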

Build Python 3.10.6

Install needed packages

sudo apt install -y \
    build-essential \
    zlib1g-dev \
    libncurses5-dev \
    libgdbm-dev \
    libnss3-dev \
    libssl-dev \
    libreadline-dev \
    libffi-dev \
    curl \
    libsqlite3-dev \
    wget \
    tk-dev \
    libbz2-dev

Build Python

cd /tmp
curl -O https://www.python.org/ftp/python/3.10.6/Python-3.10.6.tgz
tar -xf Python-3.10.6.tgz
cd Python-3.10.6
./configure --enable-optimizations
make -j$(nproc)
sudo make altinstall

Verify the installation

python3.10 --version

Set alias

Temporary

alias python='python3.10'

Permanent (tested for WSL Ubuntu)

echo -e "\nalias python='python3.10'" >> ~/.bashrc && source ~/.bashrc
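
Verify the alias points at the freshly built interpreter:

python --version # should print Python 3.10.6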

Install TCMalloc

sudo apt install libtcmalloc-minimal4 libgoogle-perftools-dev
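
webui.sh normally picks up TCMalloc on its own; if it does not, it can be preloaded manually (library path below assumes Ubuntu 22.04 on x86_64):

export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4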

Install A1111 Webui

Download and setup application

git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui
python -m venv venv
source venv/bin/activate
python -m pip install --upgrade pip wheel
deactivate
./webui.sh # this first run will likely fail because the default PyTorch build expects CUDA

Get latest ROCm wheels

https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/wsl/install-pytorch.html

source venv/bin/activate
wget https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4.1/torch-2.6.0%2Brocm6.4.1.git1ded221d-cp310-cp310-linux_x86_64.whl
wget https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4.1/torchvision-0.21.0%2Brocm6.4.1.git4040d51f-cp310-cp310-linux_x86_64.whl
wget https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4.1/pytorch_triton_rocm-3.2.0%2Brocm6.4.1.git6da9e660-cp310-cp310-linux_x86_64.whl
wget https://repo.radeon.com/rocm/manylinux/rocm-rel-6.4.1/torchaudio-2.6.0%2Brocm6.4.1.gitd8831425-cp310-cp310-linux_x86_64.whl
pip3 uninstall torch torchvision pytorch-triton-rocm
pip3 install torch-2.6.0+rocm6.4.1.git1ded221d-cp310-cp310-linux_x86_64.whl \
    torchvision-0.21.0+rocm6.4.1.git4040d51f-cp310-cp310-linux_x86_64.whl \
    torchaudio-2.6.0+rocm6.4.1.gitd8831425-cp310-cp310-linux_x86_64.whl \
    pytorch_triton_rocm-3.2.0+rocm6.4.1.git6da9e660-cp310-cp310-linux_x86_64.whl

Update WSL runtime lib

location=$(pip show torch | grep Location | awk -F ": " '{print $2}')
cd ${location}/torch/lib/
rm libhsa-runtime64.so* # remove the bundled HSA runtime so the WSL-compatible runtime from the driver is used
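
With the venv still active, a quick check that the ROCm build of PyTorch can see the card:

python -c 'import torch; print(torch.__version__); print(torch.cuda.is_available()); print(torch.cuda.get_device_name(0))'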

Run webui

You can now run the webui. If you run into precision issues, add the arguments --precision full --no-half (in my experience they were not needed).
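
For example (path assumes the repo was cloned into your home directory):

cd ~/stable-diffusion-webui
./webui.sh # add --precision full --no-half only if you hit precision issues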
