@timohausmann
Last active December 2, 2022 13:51
useAlphaVideoTexture Hook for React Three Fiber

Playing transparent videos with an alpha channel currently requires two file formats / codecs: WebM (VP9) for most browsers and MOV (HEVC, hvc1) for Safari.

This hook takes both files as input and picks the one the current browser can decode. (It is a variation of drei's useVideoTexture.)
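The choice comes down to a single canPlayType feature test: a browser that reports HEVC (hvc1) support (in practice, Safari) gets the .mov file, everything else gets the .webm. A minimal sketch of that check (the variable name is just for illustration):

// Safari answers 'probably' for this codec string; Chromium and Firefox return ''.
const canPlayHevc = document.createElement('video')
    .canPlayType('video/mp4; codecs="hvc1"') === 'probably';
// canPlayHevc === true  -> use the .mov (hvc1) file
// canPlayHevc === false -> use the .webm (vp9) file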

Usage (Vite example):

import { Billboard } from '@react-three/drei';
import { useAlphaVideoTexture } from './useAlphaVideoTexture';
import videoSrcWebm from './your-path/your-video.webm';
import videoSrcMov from './your-path/your-video.mov';

export function VideoDemo() {

    const videoTexture = useAlphaVideoTexture(videoSrcWebm, videoSrcMov, {});
    return (
        <Billboard>
            <mesh>
                <planeGeometry />
                <meshBasicMaterial
                    map={videoTexture}
                    transparent
                />
            </mesh>
        </Billboard>
    );
}
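Because the hook suspends (via suspend-react) until the video's metadata has loaded, the consuming component should sit inside a Suspense boundary. A minimal sketch, assuming the VideoDemo component above lives in ./VideoDemo:

import { Suspense } from 'react';
import { Canvas } from '@react-three/fiber';
import { VideoDemo } from './VideoDemo';

export function App() {
    return (
        <Canvas>
            <Suspense fallback={null}>
                <VideoDemo />
            </Suspense>
        </Canvas>
    );
}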
/**
 * Variation of https://github.com/pmndrs/drei/blob/master/src/core/useVideoTexture.tsx
 * to support webm+mov alpha
 */
import * as THREE from 'three'
import { useEffect, useMemo } from 'react'
import { useThree } from '@react-three/fiber'
import { suspend } from 'suspend-react'

interface VideoTextureProps extends HTMLVideoElement {
    unsuspend?: 'canplay' | 'canplaythrough' | 'loadstart' | 'loadedmetadata'
    start?: boolean
}

export function useAlphaVideoTexture(srcWebm: string, srcMov: string, props: Partial<VideoTextureProps>) {
    // Safari reports 'probably' for HEVC (hvc1); other browsers return an empty string.
    const canPlayHevc1 = useMemo(() => {
        return document.createElement('video').canPlayType('video/mp4; codecs="hvc1"') === 'probably';
    }, []);

    const { unsuspend, start, crossOrigin, muted, loop, ...rest } = {
        unsuspend: 'loadedmetadata',
        crossOrigin: 'Anonymous',
        muted: true,
        loop: true,
        start: true,
        playsInline: true,
        ...props,
    }
    const gl = useThree((state) => state.gl)

    // Suspend until the chosen source fires the `unsuspend` event, then resolve the texture.
    const texture = suspend<[string, string, boolean], () => Promise<THREE.VideoTexture>>(
        () =>
            new Promise((res) => {
                const video = Object.assign(document.createElement('video'), {
                    src: canPlayHevc1 ? srcMov : srcWebm,
                    type: canPlayHevc1 ? 'video/mp4; codecs="hvc1"' : 'video/webm',
                    crossOrigin,
                    loop,
                    muted,
                    ...rest,
                })
                const texture = new THREE.VideoTexture(video)
                texture.encoding = gl.outputEncoding
                video.addEventListener(unsuspend, () => res(texture))
            }),
        [srcWebm, srcMov, canPlayHevc1]
    )

    // Start playback once the texture is available (unless `start` is set to false).
    useEffect(() => void (start && texture.image.play()), [texture])

    return texture
}
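The options object is passed through to the video element, and `unsuspend` / `start` control when the hook resolves and whether playback begins automatically. For example, to wait until the video can play through and trigger playback yourself (a sketch, reusing the videoSrcWebm / videoSrcMov imports from the usage example above):

const texture = useAlphaVideoTexture(videoSrcWebm, videoSrcMov, {
    unsuspend: 'canplaythrough', // resolve later than the default 'loadedmetadata'
    start: false,                // don't autoplay
});

// later, e.g. in a click handler:
// texture.image.play();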