@thesamesam
thesamesam / xz-backdoor.md
Last active July 8, 2025 19:40
xz-utils backdoor situation (CVE-2024-3094)

FAQ on the xz-utils backdoor (CVE-2024-3094)

This is a living document. Everything in it is written in good faith and believed to be accurate, but, as that implies, we don't yet know everything about what's going on.

Update: I've disabled comments as of 2025-01-26 so that, a year on, everyone doesn't keep getting notifications whenever someone suggests a correction. Folks are still free to email corrections, of course.

Background

@Birch-san
Birch-san / _06_fused_attention_blockptr_jvp.py
Last active July 14, 2025 06:15
Triton fused attention tutorial, updated with JVP support. Albeit with atol=1e-3 accuracy on JVP.

from __future__ import annotations

import gc
from typing import Tuple

import torch
import torch.nn.functional as F

import triton
import triton.language as tl
import triton.testing

from kernels import get_kernel

"""
Fused Attention
===============
This is a Triton implementation of the Flash Attention v2 algorithm from Tri Dao (https://tridao.me/publications/flash2/flash2.pdf)
Credits: OpenAI kernel team
Extra Credits:
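
The preview above is truncated. To make the description's atol=1e-3 JVP claim concrete, here is a minimal sketch, not taken from the gist, of validating an attention JVP against central finite differences using torch.func.jvp (PyTorch 2.x). The manual_attn helper, the tensor shapes, and the eps value are illustrative assumptions standing in for the fused Triton kernel:

import math
import torch

def manual_attn(q, k, v):
    # Plain softmax attention; a stand-in for the fused Triton kernel under test.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])
    return torch.softmax(scores, dim=-1) @ v

torch.manual_seed(0)
shape = (2, 4, 128, 64)  # (batch, heads, seq_len, head_dim) -- illustrative sizes
primals = tuple(torch.randn(shape, dtype=torch.float64) for _ in range(3))
tangents = tuple(torch.randn(shape, dtype=torch.float64) for _ in range(3))

# Forward-mode AD: torch.func.jvp returns (primal output, directional derivative).
out, jvp_out = torch.func.jvp(manual_attn, primals, tangents)

# Central finite differences give an independent reference for the JVP.
eps = 1e-4
plus = manual_attn(*(p + eps * t for p, t in zip(primals, tangents)))
minus = manual_attn(*(p - eps * t for p, t in zip(primals, tangents)))
fd = (plus - minus) / (2 * eps)

# The gist cites atol=1e-3 for its Triton JVP; the same tolerance is applied here.
torch.testing.assert_close(jvp_out, fd, atol=1e-3, rtol=0)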