askareija / INSTALL.md
Last active November 2, 2024 11:13
Install Ollama with AMD GPU (On Laptop) Arch Linux

My laptop specs: MSI Bravo 15 B7E

  • CPU: AMD Ryzen™ 5 7535HS processor with AMD XDNA™ architecture, 6 cores, Max Boost Clock 4.55 GHz
  • GPU: AMD Radeon™ RX 6550M 4GB GDDR6
  • RAM: 32GB DDR5-4800
  1. Clone Ollama: git clone --recursive https://github.com/ollama/ollama.git
  2. Go into the directory: cd ollama (a rough sketch of the remaining build steps follows below)
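The gist preview ends here. As a hedged sketch only (the exact steps may differ from the full gist), building and running Ollama with ROCm support on an RDNA2 laptop GPU often looks roughly like this; the package list and the HSA_OVERRIDE_GFX_VERSION value are assumptions:

# Hypothetical continuation -- check the full gist before running.
sudo pacman -S --needed go cmake rocm-hip-sdk     # assumed build dependencies
go generate ./...                                 # older Ollama releases generated the llama.cpp bindings this way
go build .                                        # builds the ./ollama binary
HSA_OVERRIDE_GFX_VERSION=10.3.0 ./ollama serve    # spoof a supported gfx target for RDNA2 cards like the RX 6550M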
augustin-laurent / rocm_arch_guide.md
Last active November 14, 2024 18:21
ROCm Installation guide on Arch
Date of the guide: December 17, 2023

Introduction

In this post, I will share the solution that worked on my system for installing Radeon Open Compute (ROCm) on Arch (linux-6.6.7.arch1-1) for an RX 6900 XT (it should work on other 6000-series cards as well). ROCm is an open-source software platform for GPU-accelerated computation and is a prerequisite for GPU acceleration in TensorFlow or PyTorch. In this guide I will use Paru as my AUR helper; feel free to use any other (https://wiki.archlinux.org/title/AUR_helpers). I will assume you have a working operating system and know what you are doing with it (otherwise Arch will be painful for you).
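As a minimal sketch of what the installation step can look like with Paru (the package selection and group memberships here are assumptions, not necessarily the guide's exact commands):

# Hedged sketch -- adjust the package list to what the full guide specifies.
paru -S rocm-hip-sdk rocm-opencl-sdk rocminfo   # HIP/OpenCL runtimes plus the rocminfo probe tool
sudo usermod -aG video,render "$USER"           # grant the user access to the GPU device nodes
rocminfo | grep -i gfx                          # after re-logging in, confirm the RX 6900 XT (gfx1030) shows up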

ironicbadger / alexs-results
Last active November 4, 2024 22:12
results
#######
## These results are still open to the public. See
## https://blog.ktz.me/the-best-media-server-cpu-in-the-world/
## for analysis of them.
# https://github.com/ironicbadger/quicksync_calc
# zoidberg - dell 7040 sff pc
CPU TEST FILE BITRATE TIME AVG_FPS AVG_SPEED AVG_WATTS
i5-6600T h264_1080p_cpu ribblehead_1080p_h264 18952 kb/s 116.352s 29.88 1.04x N/A
ChenyangGao / alist.sh
Last active July 15, 2024 07:05
Deploy alist and clouddrive on a server (originally only supported OpenWrt, now works on any system)
#!/usr/bin/env bash
# current_shell_rcfile() {
# if [ -n "$BASH_VERSION" ]; then
# printf "%s\n" ~/.bashrc
# elif [ -n "$ZSH_VERSION" ]; then
# printf "%s\n" ~/.zshrc
# elif [ -n "$FISH_VERSION" ]; then
# printf "%s\n" ~/.config/fish/config.fish
# elif [ -n "$XONSH_VERSION" ]; then
geerlingguy / stable-diffusion-ubuntu-2004-amd.sh
Last active March 15, 2024 06:52
Install Stable Diffusion on an AMD GPU PC running Ubuntu 20.04
# Note: This will only work on Navi21 GPUs (6800/6900+).
# See: https://github.com/RadeonOpenCompute/ROCm/issues/1668#issuecomment-1043994570
# Install Conda (latest from https://docs.conda.io/en/latest/miniconda.html#linux-installers)
wget https://repo.anaconda.com/miniconda/Miniconda3-py39_4.12.0-Linux-x86_64.sh
bash Miniconda3-py39_4.12.0-Linux-x86_64.sh
# follow the prompts to install it, and run `conda` to make sure it's working.
# Install git and curl, and clone the stable-diffusion repo
sudo apt install -y git curl
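The preview stops here. A hedged sketch of the usual next steps follows; the repository URL and the ldm environment name come from the upstream CompVis project and may not match this gist exactly:

# Hypothetical continuation of the script above.
git clone https://github.com/CompVis/stable-diffusion.git
cd stable-diffusion
conda env create -f environment.yaml   # the upstream environment is named "ldm"
conda activate ldm
# A ROCm build of PyTorch still has to replace the default CUDA build for AMD GPUs;
# the exact wheel index to use depends on the installed ROCm version.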
mr-karan / deployment.hcl
Last active December 26, 2023 10:30
Single Node nomad and consul
job "hello-world" {
datacenters = ["dc1"]
namespace = "default"
type = "service"
group "redis" {
# Specify number of replicas of redis needed.
count = 1
# Specify networking for the group, port allocs.
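A short usage sketch, assuming Nomad and Consul are installed and this file is saved as deployment.hcl; the -dev flags below are only for quick local testing, not the gist's actual single-node setup:

consul agent -dev &            # throwaway local Consul
nomad agent -dev &             # throwaway single-node Nomad
nomad job run deployment.hcl   # submit the hello-world job
nomad job status hello-world   # inspect the allocations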
aleksasiriski / proxmoxlxcjellyfin.md
Last active October 15, 2024 19:02
Proxmox LXC Alpine Docker Jellyfin

How to set up VA-API within an unprivileged Proxmox LXC container

Proxmox configuration

No drivers need to be installed on Proxmox itself, referred to from now on as the host.

Find the GIDs of the video and render groups on the host:

cat /etc/group | grep video

cat /etc/group | grep render
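With those GIDs in hand, a hedged sketch of the kind of lines the unprivileged container's config gains (the container ID 200 is an assumption, and the exact GID-mapping mechanism depends on the Proxmox version and the rest of this gist):

# /etc/pve/lxc/200.conf -- hypothetical additions for /dev/dri passthrough
lxc.cgroup2.devices.allow: c 226:* rwm                           # DRI devices use major number 226
lxc.mount.entry: /dev/dri dev/dri none bind,optional,create=dir  # bind the host's render nodes into the container
# On newer Proxmox releases a "dev0: /dev/dri/renderD128,gid=<render GID>" entry can be used instead,
# which also handles the GID mapping for unprivileged containers.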

bearlike / clear_systemd_journal_logs.md
Created March 8, 2022 07:59
Clear Systemd Journal Logs in Ubuntu 20.04

Clear Systemd Journal Logs in Ubuntu 20.04

Systemd has its own logging system called the journal, and the log files are stored in /var/log/journal.

Check current disk usage of journal files

sudo journalctl --disk-usage

Rotate journal files

  • Active journal files will be marked as archived, so that they are never written to in the future.
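The preview cuts off before the commands themselves; the journalctl invocations typically used for this are (the vacuum thresholds below are only examples):

sudo journalctl --rotate            # archive the active journal files
sudo journalctl --vacuum-time=2d    # then delete archived entries older than two days
sudo journalctl --vacuum-size=100M  # or cap archived journals at roughly 100 MB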
#!/bin/bash
export LC_CTYPE=en_US.UTF-8
export LC_ALL=en_US.UTF-8
# Must run on Proxmox VE 7 server
# Not sure how to handle a cluster - either run on each node or copy template after creating on one?
# e.g. $ ssh [email protected] < proxmox-create-cloud-template.sh
SRC_IMG="https://cloud-images.ubuntu.com/focal/current/focal-server-cloudimg-amd64-disk-kvm.img"