
@theSoberSobber
Created April 6, 2025 08:22
Fetch Spy: Monkey-patching fetch. Import it at the top, before calling fetch or using any library that uses fetch, to inspect its network requests.
import { writeFileSync } from "fs";

// Keep a reference to the real fetch before replacing it.
const originalFetch = globalThis.fetch.bind(globalThis);

// Deep-copy an object into something JSON.stringify can handle,
// replacing circular references and values that throw on access.
function safeStringifyDeep(obj, seen = new WeakSet()) {
  if (obj === null || typeof obj !== "object") return obj;
  if (seen.has(obj)) return "<<Circular>>";
  seen.add(obj);

  if (Array.isArray(obj)) {
    return obj.map((item) => {
      try {
        return safeStringifyDeep(item, seen);
      } catch {
        return "<<Non-serializable>>";
      }
    });
  }

  const result = {};
  for (const [k, v] of Object.entries(obj)) {
    try {
      result[k] = typeof v === "object" ? safeStringifyDeep(v, seen) : v;
    } catch {
      result[k] = "<<Non-serializable>>";
    }
  }
  return result;
}

// Write a timestamped JSON dump to the current working directory
// and echo it to the console.
function writeDump(name, obj) {
  const ts = new Date().toISOString().replace(/[:.]/g, "-");
  const filename = `file_${name}_${ts}.json`;
  try {
    const data = JSON.stringify(obj, null, 2);
    writeFileSync(filename, data);
    console.log(`📁 Saved ${name} to ${filename}`);
    console.log(`🔍 ${name} data:\n`, data, "\n");
  } catch (err) {
    console.error(`❌ Failed to write ${name}:`, err);
  }
}

// Monkey-patch the global fetch: dump the request, forward it to the real
// fetch, dump the response, then return the untouched Response object.
globalThis.fetch = async (url, options = {}) => {
  const requestInfo = {
    timestamp: new Date().toISOString(),
    url,
    method: options.method ?? "GET",
    headers: options.headers,
    body: (() => {
      if (!options.body) return null;
      try {
        // Prefer parsed JSON so the dump is readable; fall back to raw text.
        return JSON.parse(options.body.toString());
      } catch {
        return options.body.toString();
      }
    })(),
  };
  writeDump("request", safeStringifyDeep(requestInfo));

  let res;
  try {
    res = await originalFetch(url, options);
  } catch (err) {
    console.error("❌ Fetch failed:", err);
    throw err;
  }

  // Read the body from a clone so the caller can still consume the original.
  let resText = null;
  try {
    resText = await res.clone().text();
  } catch {
    resText = "<<Unable to read response body>>";
  }

  const responseInfo = {
    timestamp: new Date().toISOString(),
    status: res.status,
    statusText: res.statusText,
    headers: Object.fromEntries(res.headers.entries()),
    body: (() => {
      try {
        return JSON.parse(resText);
      } catch {
        return resText;
      }
    })(),
  };
  writeDump("response", safeStringifyDeep(responseInfo));

  return res;
};
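
A note on coverage: the patch records options.headers as-is, so headers passed as a Headers instance serialize to an empty object in the dump, and a Request object passed as the first argument is forwarded but not unpacked. Below is a minimal sketch of one way to flatten headers before dumping them, assuming Node 18+ where Headers is a global; the normalizeHeaders helper is illustrative and not part of the original gist.

// Illustrative helper (not in the original gist): flatten any HeadersInit
// shape (Headers instance, array of pairs, or plain object) into a plain
// object so it serializes cleanly in the dump.
function normalizeHeaders(headers) {
  if (!headers) return {};
  if (headers instanceof Headers || Array.isArray(headers)) {
    return Object.fromEntries(new Headers(headers).entries());
  }
  return { ...headers }; // already a plain object
}

If you adopt this, swap headers: options.headers in requestInfo for headers: normalizeHeaders(options.headers).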
@theSoberSobber

use as:

import './fetch-spy.js'; // Must come before anything that uses fetch

import { google } from '@ai-sdk/google';
import { generateText } from 'ai';

const { text } = await generateText({
  model: google('gemini-1.5-pro-latest'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

console.log(text);

@theSoberSobber

Now you can inspect the final prompts or tool specs that the Vercel AI SDK sends to LLMs without diving deep into its ridiculously heavy production code.
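
For instance, here is a quick, hypothetical helper for pulling the body out of the most recent request dump. It assumes the file_request_*.json names produced by writeDump above, and relies on the dashed ISO timestamps sorting lexicographically in chronological order.

import { readdirSync, readFileSync } from "fs";

// Grab the newest request dump written by the spy and print its body;
// for the generateText() call above, that's the final payload sent to the model.
const latest = readdirSync(".")
  .filter((f) => f.startsWith("file_request_") && f.endsWith(".json"))
  .sort()
  .at(-1);

if (latest) {
  const dump = JSON.parse(readFileSync(latest, "utf8"));
  console.log(JSON.stringify(dump.body, null, 2));
} else {
  console.log("No request dumps found yet.");
}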
