@charlespwd
Last active October 21, 2020 14:13
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request))
})

// Resolves after `ms` milliseconds.
const sleep = ms => new Promise(r => setTimeout(r, ms));

async function handleRequest(request) {
  const url = new URL(request.url)

  // Disallow crawlers
  if (url.pathname === '/robots.txt') {
    return new Response('User-agent: *\nDisallow: /', { status: 200 })
  }

  // Parse the query string into a plain { key: value } object.
  const qs = url.search
    .slice(1)
    .split('&')
    .map((x) => x.split('='))
    .reduce((acc, [k, v]) => {
      acc[k] = v
      return acc
    }, {})

  // Both `path` and `delay` are required; otherwise return a placeholder body.
  const shouldTransform = qs.path && qs.delay;
  if (!shouldTransform) {
    return new Response('...')
  }

  // Fetch the upstream resource (`path` is the target URL without its protocol).
  const newUrl = `https://${decodeURIComponent(qs.path)}`;
  const res = await fetch(newUrl)

  // Copy the upstream response and disable caching so the delay applies on every request.
  const response = new Response(res.body, res);
  response.headers.set('cache-control', 'no-store');

  // Hold the response for `delay` milliseconds (a string here; setTimeout coerces it).
  await sleep(qs.delay);
  return response;
}
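
For reference, the hand-rolled query-string parsing above could also be written with the built-in URLSearchParams API, which the Workers runtime provides and which also percent-decodes values. A minimal standalone sketch (the example URL below is hypothetical, not part of the worker):

// Standalone sketch, not part of the worker above.
const url = new URL('https://delay.example.workers.dev/?path=via.placeholder.com/150&delay=3000')
const qs = Object.fromEntries(url.searchParams)
console.log(qs) // { path: 'via.placeholder.com/150', delay: '3000' }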
@charlespwd (Author)

Usage:

https://delay.cpclermont.workers.dev?path=$urlWithoutProtocol&delay=$delayInMs

Example:
https://delay.cpclermont.workers.dev?path=via.placeholder.com/150&delay=3000
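
You can sanity-check the added latency by timing a request against the deployed worker. A rough sketch, assuming an environment with a global fetch (a browser console, or Node 18+) and the example deployment above:

// Rough timing check; total time should be roughly delay + upstream fetch time.
const start = Date.now()
const res = await fetch('https://delay.cpclermont.workers.dev?path=via.placeholder.com/150&delay=3000')
console.log(res.status, (Date.now() - start) + 'ms')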
