@achuthhadnoor
Created April 18, 2023 07:23
Timelapse screen recording app to save time for artists, educators and designers
// Step 1: Request the default media source (the screen)
const mediaSource = navigator.mediaDevices.getDisplayMedia({ video: true });

// Step 2: Record the screen, collecting a video chunk (Blob) every 1 minute
const videoChunks = [];
const chunkInterval = 60000; // 1 minute

mediaSource.then(screenStream => {
  const mediaRecorder = new MediaRecorder(screenStream);

  mediaRecorder.ondataavailable = (event) => {
    if (event.data && event.data.size > 0) {
      videoChunks.push(event.data);
    }
  };

  // Emit a chunk every `chunkInterval` milliseconds;
  // call mediaRecorder.stop() when the capture session ends to trigger onstop below.
  mediaRecorder.start(chunkInterval);

  // Step 3: When recording stops, combine the chunks and render a timelapse with FFmpeg WASM
  mediaRecorder.onstop = async () => {
    const videoBlob = new Blob(videoChunks, { type: 'video/webm' });

    // Load the FFmpeg WASM module by injecting its script tag
    const createFFmpeg = () => {
      return new Promise(resolve => {
        const script = document.createElement('script');
        script.src = 'https://github.com/ffmpegwasm/ffmpeg.wasm/releases/latest/download/ffmpeg.min.js';
        script.onload = () => resolve(window.FFmpeg);
        document.head.appendChild(script);
      });
    };

    const FFmpeg = await createFFmpeg();

    // Instantiate and initialize FFmpeg WASM
    const ffmpeg = FFmpeg.createFFmpeg({ log: true });
    await ffmpeg.load();

    // Write the captured video into FFmpeg's in-memory virtual filesystem
    const inputFilename = 'input.webm';
    const outputFilename = 'output.mp4';
    ffmpeg.FS('writeFile', inputFilename, new Uint8Array(await videoBlob.arrayBuffer()));

    // Apply the timelapse effect: setpts=0.1*PTS rescales timestamps so playback is 10x faster.
    // Note: ffmpeg.run() takes each argument separately; don't split a quoted shell string on spaces.
    await ffmpeg.run('-i', inputFilename, '-vf', 'setpts=0.1*PTS', outputFilename);

    // Read the rendered video back out as a Uint8Array and expose it as an object URL
    const outputData = ffmpeg.FS('readFile', outputFilename);
    const renderedVideoBlob = new Blob([outputData.buffer], { type: 'video/mp4' });
    const renderedVideoUrl = URL.createObjectURL(renderedVideoBlob);
    // renderedVideoUrl can now be assigned to a <video> element's src.
  };
}).catch(error => {
  console.error('Error accessing media devices:', error);
});
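One pitfall worth calling out: ffmpeg.run() takes its arguments as separate strings, not as a single shell line, so splitting `-vf "setpts=0.1*PTS"` on spaces would leave literal quote characters inside the filter argument. A small helper (hypothetical, not part of the gist or of ffmpeg.wasm's API) can build the argument list from a speed-up factor:

```javascript
// Build the argument list for ffmpeg.run() for a simple timelapse render.
// `speedup` is how many times faster the output should play (e.g. 10).
// Hypothetical helper, shown for illustration only.
function buildTimelapseArgs(inputFile, outputFile, speedup) {
  // setpts=N*PTS rescales each frame's presentation timestamp; N = 1/speedup
  const factor = 1 / speedup;
  return ['-i', inputFile, '-vf', `setpts=${factor}*PTS`, outputFile];
}

// Usage: await ffmpeg.run(...buildTimelapseArgs('input.webm', 'output.mp4', 10));
```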
/*
Explanation:
This code uses navigator.mediaDevices.getDisplayMedia() to capture the screen as the media source.
It then uses the MediaRecorder API to record the screen, emitting a video chunk (Blob) every 1 minute.
Finally, it uses ffmpeg.wasm, a WebAssembly build of FFmpeg that runs entirely in the browser, to combine
the captured chunks into a single video with a timelapse effect: the filter setpts=0.1*PTS rescales each
frame's presentation timestamp to one tenth of its original value, so the footage plays back 10x faster.
The code loads the ffmpeg.wasm module, initializes it, and then uses ffmpeg.run() to execute the FFmpeg command.
The rendered video is stored as a Blob and exposed as an object URL for further processing,
such as displaying it in a <video> element or uploading it to a server.
Why this code:
I like to solve the problem using less code, for:
- better readability
- coding standards
- unit testing
- version control
*/
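Because setpts only rescales timestamps (no frames are dropped or duplicated), the output duration is simply the recorded duration divided by the speed-up factor: a 60-minute session at 10x renders as 6 minutes. A quick sanity-check helper (hypothetical, for illustration):

```javascript
// Estimate the rendered timelapse duration, in milliseconds.
// Hypothetical helper; mirrors the math behind setpts=(1/speedup)*PTS.
function timelapseDurationMs(recordedMs, speedup) {
  return recordedMs / speedup;
}

// A 60-minute recording at 10x speed renders as 6 minutes:
// timelapseDurationMs(60 * 60 * 1000, 10) === 6 * 60 * 1000
```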