This tiny server facilitates automatic updates for WebExtensions released on GitHub.
Please read the README for more information.
While, in most cases, the stable channel of an extension should be hosted on AMO, this does not work well for beta/dev/... releases (anymore). An alternative is to upload the pre-release bundles to GitHub releases (e.g. through a CI) and have the browser update the extension from there. For the browser to check for updates, it needs a constant URL that points to the latest version. GitHub releases unfortunately don't support this out of the box.
This server therefore generates redirects from per-extension constant URLs to the latest version of the extension, in a format that is supported by Firefox's extension update protocol. It can be used with any extension that provides versioned GitHub releases, and provides two URLs for each extension: one for the browser to check for updates, and one to be called to purge the redirect from a 24-hour cache.
Add this JSON snippet to your (pre-release) `manifest.json`:

```json
"applications": {
    "gecko": {
        "id": "<extension-id>",
        "update_url": "<base_url>/xpi.json?user=<github-username>&repo=<github-reponame>&id=<extension-id>"
    }
},
```
Where `<base_url>` is the base URL this server is hosted at (see Setup), `<github-username>` and `<github-reponame>` describe your extension repository, and the two `<extension-id>`s must be identical.
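For illustration, here is the same snippet with hypothetical values filled in (the user, repository, and ID below are made up):

```json
"applications": {
    "gecko": {
        "id": "my-extension@example.com",
        "update_url": "https://updates.example.com/xpi.json?user=octocat&repo=my-extension&id=my-extension@example.com"
    }
},
```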
Then, every time after uploading a new release, send a `DELETE` request to the `update_url` to purge the cache entry, e.g.:
```sh
wget --method=DELETE -qO- '<base_url>/xpi.json?user=<user>&repo=<repo>&id=<id>' || echo failed
```
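If `curl` is available instead of `wget`, the equivalent call should look like this:

```sh
curl -fsS -X DELETE '<base_url>/xpi.json?user=<user>&repo=<repo>&id=<id>' || echo failed
```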
On the next update check, Firefox will then update the extension to the new release.
To force an update check for all extensions, use the "Check for Updates" menu option on `about:addons`; to force it for a single extension only, see this gist.
The server fetches the latest release of the repo in question from GitHub and looks for the first `.xpi` file therein.
The version of the release is read from the name of that file, if it is named as returned by the AMO signing API call, or is otherwise taken from the release's tag name.
The returned JSON update manifest lists that version and file as the only entry, which prompts Firefox to update the extension if that version is greater than the installed one.
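For a release of version `1.2.3` with a matching `.xpi` asset, the generated manifest would look roughly like this (all values hypothetical):

```json
{
    "addons": {
        "my-extension@example.com": {
            "updates": [ {
                "version": "1.2.3",
                "update_link": "https://github.com/octocat/my-extension/releases/download/v1.2.3/my-extension-1.2.3-an.fx.xpi"
            } ]
        }
    }
}
```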
To run this server in production, either run it as a local program, e.g. with `pm2`, or run it with Docker.
Either way, it will also require a reverse proxy, e.g. `nginx`, to handle TLS/HTTPS, whose configuration defines the `<base_url>` used above.
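With `pm2` installed globally, starting the server could for example look like this (the process name is arbitrary):

```sh
PORT=40080 pm2 start index.js --name web-ext-updater
```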
To start with Docker, download/clone the repo, make it readable by world (or the correct UID), `cd` into it, optionally create a `docker-compose.override.yml`, then run:
```sh
docker-compose up -d
```
Since the server needs outbound traffic, make sure the network setup works (e.g. set `dns` in the override config).
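A minimal override file for that could look like this (the DNS servers are just examples):

```yaml
# docker-compose.override.yml
version: '2.1'
services:
    node:
        dns: [ "1.1.1.1", "8.8.8.8" ]
```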
Check the status with `docker ps` and `docker logs web-ext-updater`.
After an update to `docker-compose*.yml`, run `docker-compose up -d` again.
After other updates, run `docker-compose restart`.
A sensible `nginx` config will vastly depend on how the system is otherwise set up, but it could include this:
```nginx
server {
    server_name <hostname>;
    listen 80; listen [::]:80;
    listen 443 ssl http2; listen [::]:443 ssl http2;
    ssl_certificate     /etc/letsencrypt/live/<hostname>/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/<hostname>/privkey.pem;

    location /.well-known/acme-challenge/ { root /var/www/html; }
    if ($scheme != "https") { return 301 https://$host$request_uri; }
    add_header Strict-Transport-Security "max-age=63072000" always;

    location / { proxy_pass http://127.0.0.1:40080$request_uri; }
}
```
Setting headers via `proxy_set_header` isn't really necessary, since they aren't accessed by the server.
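Once everything is up, a quick smoke test (with your own values substituted) might be:

```sh
curl -fsS '<base_url>/xpi.json?user=<user>&repo=<repo>&id=<id>'
```

This should print the generated update manifest as JSON.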
The MIT License (MIT)
Copyright (c) 2018 Niklas Gollenstede
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
`docker-compose.yml`:

```yaml
version: '2.1'
services:
    node:
        image: node:12-alpine
        container_name: web-ext-updater
        environment:
            - PORT=${PORT:-40080}
        volumes:
            - .:/usr/src/app:ro
        entrypoint: su node -c 'node /usr/src/app/index.js'
        ports:
            - "127.0.0.1:${PORT:-40080}:${PORT:-40080}"
        restart: unless-stopped
```
`index.js`:

```js
#!/usr/bin/env node
'use strict'; /* globals Buffer, */

/**
 * This script runs a server that, given a GitHub repo and an extension ID, returns a
 * Mozilla WebExtension update manifest to the `.xpi` file in the latest release of that repo.
 *
 * API:
 *  * GET `xpi.json?user=<user>&repo=<repo>&id=<id>`: Get the update manifest.
 *    Fetches the information about the latest release on the specified repo,
 *    extracts the download URL for the first `.xpi` file it finds in the release,
 *    extracts the version number from the filename or release tag and
 *    returns an update manifest `.json` which lists the extension ID with that version and URL.
 *  * DELETE `xpi.json?user=<user>&repo=<repo>&id=<id>`: Clear the cache for that update manifest.
 *    Can be called during the automated build of the next release.
 *    Returns 200 immediately but deletes after a 10 sec delay, so it can be run
 *    during the build before the publication is actually done.
 */

const PORT = process.env.PORT || 40080;
const QS = require('querystring'), https = require('https'), URL = require('url');
const cache = new Map/*{ timer, promise, value, evicted, }*/;
require('http').createServer(async (req, res) => { {
    console.log('request', req.method, req.url);
} try { switch (req.method) {
    case 'DELETE': case 'GET': {
        const [ , path, search, ] = (/^(.*?)(?:\?(.*))?$/).exec(req.url);
        if (search && search.length > 500) { return reply(res, 414); }
        const query = search ? QS.parse(search) : { }; Object.entries(query).forEach(([ key, value, ]) => Array.isArray(value) && (query[key] = value[0]));
        switch (path) {
            case '/xpi.json': {
                return (await methods[req.method](query, res));
            }
        }
        return reply(res, 404);
    }
    default: return reply(res, 405); // only GET and DELETE are supported
} } catch (error) { error = error || { };
    error.message && console.error('server error', error); return reply(res, error.status || 500);
} }).listen(PORT, () => console.log('Listening on', PORT));
function reply(res, code, message = null) {
    if (typeof message === 'string') { message = Buffer.from(message, 'utf-8'); }
    if (message !== null && !Buffer.isBuffer(message)) { message = null; code = 500; console.error('Invalid reply.'); }
    console.log('reply', code);
    res.writeHead(code, {
        'Content-Length': message ? message.length : 0,
    }); res.end(message);
}
function replyJson(res, json) {
    res.setHeader('Content-Type', 'application/json');
    reply(res, 200, json);
}
function verifyQuery(query) {
    const { user, repo, id, } = query;
    if (!word(user) || !word(repo) || !word(id)) { return null; }
    return user +'/'+ repo +'/'+ id;
    function word(it) { return it && (/^[\w.@-]+$/).test(it); }
}
const methods = {
    DELETE(query, res) {
        const key = verifyQuery(query); if (!key) { return reply(res, 400, 'bad query'); }
        const cached = cache.get(key); if (cached && !cached.evicted) {
            cached.evicted = true; cached.timer && clearTimeout(cached.timer);
            setTimeout(() => cache.delete(key) && console.info('deleted', key), 10e3); // make sure the update has reached GitHub
        }
        return reply(res, 200, 'done\n');
    },
    async GET(query, res) {
        const key = verifyQuery(query); if (!key) { return reply(res, 400, 'bad query'); }
        const cached = cache.get(key); if (cached && (cached.promise || cached.value)) {
            return replyJson(res, cached.promise ? (await cached.promise) : cached.value);
        }
        const promise = getUpdateJson(query);
        const entry = { promise, value: null, timer: null, evicted: false, }; cache.set(key, entry);
        try {
            const value = (await promise);
            if (!entry.evicted) {
                entry.promise = null; entry.value = value;
                entry.timer = setTimeout(() => cache.delete(key) && console.info('deleted', key), 24 * 3600e3);
            }
            return replyJson(res, value);
        } catch (error) {
            entry.promise = null; entry.evicted = true;
            cache.delete(key); throw error;
        }
    },
};
async function getUpdateJson(query) {
    const { user, repo, id, } = query;
    const url = `https://api.github.com/repos/${user}/${repo}/releases/latest`;
    console.info('fetching', url, 'for', id);
    let json; try { json = (await fetchJson(url)); }
    catch (error) { console.error(error); throw { status: 502, }; } // eslint-disable-line no-throw-literal
    const file = json && Array.isArray(json.assets) && json.assets.find(_=>_.name.endsWith('.xpi'));
    if (!file) { throw { status: 502, }; } // eslint-disable-line no-throw-literal
    const fromFile = (/^[\w-]+-(\d+[.]\d+[.]\d+(?:[.a-z]\d+)?)-an\.fx\.xpi$/).exec(file.name);
    const version = fromFile ? fromFile[1] : (json.tag_name || json.name).replace(/^v/, ''); // fall back to the release's tag name
    const manifest = { addons: { [id]: { updates: [ {
        version, update_link: file.browser_download_url || file.url,
    }, ], }, }, };
    return Buffer.from(JSON.stringify(manifest) +'\n', 'utf-8');
}
async function fetchJson(url) { return new Promise((resolve, reject) => {
    const o = typeof url === 'string' ? URL.parse(url) : url;
    !o.headers && (o.headers = { });
    !o.headers['user-agent'] && (o.headers['user-agent'] = 'WebExt update manifest generator/0.0.1');
    !o.timeout && (o.timeout = 30e3);
    const request = https.get(o, response => { try {
        const type = response.headers['content-type'];
        if (response.statusCode !== 200) {
            response.resume(); // consume response data to free up memory
            throw new Error(`Status Code: ${response.statusCode}`);
        }
        if (!(/^application\/json(?:;|$)/).test(type)) {
            response.resume(); // consume response data to free up memory
            throw new TypeError(`Unexpected MIME-Type: ${type}`);
        }
        response.setEncoding('utf8');
        response.setTimeout(o.timeout, () => { reject(new Error(`Request timeout`)); request.abort(); });
        let data = ''; response.on('data', chunk => (data += chunk));
        response.on('end', () => { try {
            resolve(JSON.parse(data));
        } catch (error) { reject(error); } });
    } catch (error) { reject(error); } }).on('error', reject)
    .on('timeout', () => { reject(new Error(`Request timeout`)); request.abort(); });
}); }
```
`package.json`:

```json
{
    "name": "web-ext-updater",
    "version": "1.0.0",
    "description": "Tiny server to facilitate automatic updates for WebExtensions released on GitHub",
    "keywords": [ "Firefox", "WebExtension", "GitHub", "update", "development" ],
    "author": "NiklasGollenstede",
    "license": "MIT",
    "repository": "gist:60aa2dc957f985eff2b7a2655ea1092b",
    "homepage": "https://gist.github.com/NiklasGollenstede/60aa2dc957f985eff2b7a2655ea1092b#file-readme-md",
    "main": "index.js", "scripts": { "start": "node index.js" }
}
```