There are a few ways to get the links you're interested in:

- Visit the site's `/sitemap.xml` page.
- Convert the XML to JSON using a tool like this one.
- Extract the URLs programmatically from the JSON structure in your browser console.
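If you'd rather skip the XML-to-JSON step, the `<loc>` entries can also be pulled straight out of the raw sitemap text. A minimal sketch (the regex-based `extractSitemapUrls` helper is my own; in the console you'd feed it `await (await fetch('/sitemap.xml')).text()`):

```js
// Hypothetical helper: extract <loc> URLs from raw sitemap XML.
// Uses a regex instead of DOMParser so it runs in any JS runtime.
const extractSitemapUrls = (xml) =>
  Array.from(xml.matchAll(/<loc>\s*([^<]+?)\s*<\/loc>/g)).map((m) => m[1]);

const sample = `<urlset>
  <url><loc>https://example.com/page1</loc></url>
  <url><loc>https://example.com/page2</loc></url>
</urlset>`;
console.log(extractSitemapUrls(sample));
// → ["https://example.com/page1", "https://example.com/page2"]
```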
If the links are on a specific page, you can extract them directly:

```js
Array.from(document.querySelectorAll('a[class*="sidebar-link_"]')).map((el) => el.href);
```

Adjust the selector as needed, based on the structure of the page.
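If the class-based selector doesn't match anything, a fallback is to collect every anchor on the page and filter/deduplicate by path. The `filterLinks` helper and the `/docs/` prefix below are my own illustrative assumptions:

```js
// Keep only hrefs containing a given prefix, dropping duplicates.
const filterLinks = (hrefs, prefix) =>
  [...new Set(hrefs.filter((h) => h.includes(prefix)))];

// In the console you might call:
//   filterLinks(Array.from(document.links).map((a) => a.href), '/docs/');
console.log(filterLinks(
  [
    'https://example.com/docs/intro',
    'https://example.com/docs/intro',
    'https://example.com/about',
  ],
  '/docs/'
));
// → ["https://example.com/docs/intro"]
```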
Once you have the URLs, copy them using `copy()` in the browser console, then define them in the console of the NotebookLM tab:

```js
const urls = [
  "https://example.com/page1",
  "https://example.com/page2",
  // etc.
];
```
- In NotebookLM, with DevTools open, add a dummy source (e.g., `https://facebook.com`) to your notebook.
- Open the DevTools Network tab and locate the corresponding `fetch` request.
- Copy the request as `fetch`.
It will look something like this:

```js
fetch("https://notebooklm.google.com/...", {
  headers: {
    ...
  },
  body: `...${encodeURIComponent(url)}...`,
  method: "POST",
  mode: "cors",
  credentials: "include"
});
```
- Wrap it in an `addSource` function:

```js
const addSource = (url) => {
  fetch("https://notebooklm.google.com/...", {
    headers: {
      ...
    },
    body: `...${encodeURIComponent(url)}...`,
    method: "POST",
    mode: "cors",
    credentials: "include"
  });
};
```
🔧 Replace any hardcoded URL in the request body (e.g., `https%3A%2F%2Ffacebook.com`) with `${encodeURIComponent(url)}`.
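To sanity-check the substitution, note that `encodeURIComponent` produces exactly the percent-encoded form you'll see in the copied request body:

```js
// ':' becomes %3A and '/' becomes %2F, matching the hardcoded
// URL in the copied request body.
console.log(encodeURIComponent("https://facebook.com"));
// → https%3A%2F%2Ffacebook.com
```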
Once the function is defined, loop through all the URLs to submit them:

```js
for (const url of urls) {
  addSource(url);
}
```
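Firing every request in one synchronous loop can swamp the backend. A hedged alternative (my own sketch, assuming you change `addSource` to `return` its `fetch` promise) submits the URLs one at a time with a short pause between them:

```js
// Sequential variant: await each request, then pause briefly.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const addAll = async (urls, addSource, delayMs = 500) => {
  for (const url of urls) {
    await addSource(url); // assumes addSource returns the fetch promise
    await sleep(delayMs);
  }
};

// Usage: addAll(urls, addSource);
```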
Don’t refresh the page immediately! Wait until all HTTP requests have been received and processed by Google before refreshing NotebookLM.