@theburningmonk
Created August 30, 2017 01:37
'use strict';
const co = require('co');
const EventEmitter = require('events');
const Promise = require('bluebird');
const AWS = require('aws-sdk');
const ssm = Promise.promisifyAll(new AWS.SSM());
const DEFAULT_EXPIRY = 3 * 60 * 1000; // default expiry is 3 mins
function loadConfigs (keys, expiryMs) {
  expiryMs = expiryMs || DEFAULT_EXPIRY; // defaults to 3 mins

  if (!keys || !Array.isArray(keys) || keys.length === 0) {
    throw new Error('you need to provide a non-empty array of config keys');
  }

  if (expiryMs <= 0) {
    throw new Error('you need to specify an expiry (ms) greater than 0, or leave it undefined');
  }

  // the below uses the captured closure to return an object with a gettable
  // property per config key that on invoke:
  //  * fetch the config values and cache them the first time
  //  * thereafter, use cached values until they expire
  //  * otherwise, try fetching from SSM parameter store again and cache them
  let cache = {
    expiration : new Date(0),
    items      : {}
  };

  let eventEmitter = new EventEmitter();

  let validate = (keys, params) => {
    let missing = keys.filter(k => params[k] === undefined);
    if (missing.length > 0) {
      throw new Error(`missing keys: ${missing}`);
    }
  };

  let reload = co.wrap(function* () {
    console.log(`loading cache keys: ${keys}`);

    let req = {
      Names: keys,
      WithDecryption: true
    };
    let resp = yield ssm.getParametersAsync(req);

    let params = {};
    for (let p of resp.Parameters) {
      params[p.Name] = p.Value;
    }

    validate(keys, params);
    console.log(`successfully loaded cache keys: ${keys}`);

    let now = new Date();
    cache.expiration = new Date(now.getTime() + expiryMs);
    cache.items = params;

    eventEmitter.emit('refresh');
  });

  let getValue = co.wrap(function* (key) {
    let now = new Date();
    if (now <= cache.expiration) {
      return cache.items[key];
    }
    try {
      yield reload();
      return cache.items[key];
    } catch (err) {
      // cache.items is a plain object, so check its key count ({}.length is always undefined)
      if (Object.keys(cache.items).length > 0) {
        // swallow exception if cache is stale, as we'll just try again next time
        console.log('[WARN] swallowing error from SSM Parameter Store:\n', err);
        eventEmitter.emit('refreshError', err);
        return cache.items[key];
      }

      console.log(`[ERROR] couldn't fetch the initial configs : ${keys}`);
      console.error(err);
      throw err;
    }
  });

  let config = {
    onRefresh      : listener => eventEmitter.addListener('refresh', listener),
    onRefreshError : listener => eventEmitter.addListener('refreshError', listener)
  };

  for (let key of keys) {
    Object.defineProperty(config, key, {
      get: function() { return getValue(key); },
      enumerable: true,
      configurable: false
    });
  }

  return config;
}

module.exports = {
  loadConfigs
};
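
For context, a rough usage sketch in the same co/yield style as the gist (the key names, the configs.js file name, and the handler are hypothetical, not part of the original). Each property on the returned object is a getter that returns a Promise, so reads must be yielded/awaited:

'use strict';

const co = require('co');
const { loadConfigs } = require('./configs'); // assuming the gist above is saved as configs.js

// module scope: runs once per cold start; the returned object owns the cache
const configs = loadConfigs(['key1', 'key2']);
configs.onRefresh(() => console.log('configs refreshed from SSM'));
configs.onRefreshError(err => console.log('[WARN] failed to refresh configs', err));

module.exports.handler = (event, context, callback) => {
  co(function* () {
    // each read goes through getValue: served from cache if fresh, reloaded from SSM otherwise
    const key1 = yield configs.key1;
    const key2 = yield configs.key2;
    callback(null, { statusCode: 200, body: JSON.stringify({ key1, key2 }) });
  }).catch(callback);
};
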
@theburningmonk (Author)

You should use middy's ssm middleware instead of doing this yourself: https://github.com/middyjs/middy/blob/master/docs/middlewares.md#ssm

But if you do want to use this, then you should replace the co.wrap and yield with async and await respectively; then you can use it like this:

const configs = loadConfigs(['key1', 'key2']);
const key1 = await configs.key1;
const key2 = await configs.key2;
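
For reference, a minimal sketch (not the author's code) of what that conversion could look like for the two generator functions inside loadConfigs; everything else in the closure (keys, expiryMs, cache, eventEmitter, validate, ssm) stays as in the gist, and the co dependency can then be dropped:

let reload = async () => {
  console.log(`loading cache keys: ${keys}`);

  let resp = await ssm.getParametersAsync({ Names: keys, WithDecryption: true });

  let params = {};
  for (let p of resp.Parameters) {
    params[p.Name] = p.Value;
  }

  validate(keys, params);

  cache.expiration = new Date(Date.now() + expiryMs);
  cache.items = params;
  eventEmitter.emit('refresh');
};

let getValue = async (key) => {
  if (new Date() <= cache.expiration) {
    return cache.items[key];
  }

  try {
    await reload();
    return cache.items[key];
  } catch (err) {
    if (Object.keys(cache.items).length > 0) {
      // stale values are better than none; we'll retry on the next read
      console.log('[WARN] swallowing error from SSM Parameter Store:\n', err);
      eventEmitter.emit('refreshError', err);
      return cache.items[key];
    }
    throw err;
  }
};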

@sappusaketh

Thank you for the response. I'm able to convert it to async/await and make it work, but my question is that for each and every new request it makes a request to SSM. The cache is getting cleared after each request. I am able to mock the AWS environment via LocalStack and the Serverless Framework.

@theburningmonk (Author)

I would suggest using the middy middleware instead, or looking at its implementation: https://github.com/middyjs/middy/blob/master/src/middlewares/ssm.js

This was a simple demonstration of how you'd do such a cache client.
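
On the cache being cleared for every request: the usual cause is calling loadConfigs inside the handler, which creates a brand new (empty) cache on every invocation. Calling it once at module scope ties the cache to the Lambda container, so warm invocations reuse the cached values. A sketch, with hypothetical key names and file name:

const { loadConfigs } = require('./configs'); // assuming the gist is saved as configs.js

// module scope: executed once per container, not once per request
const configs = loadConfigs(['key1', 'key2']);

module.exports.handler = async (event) => {
  // hits SSM only on a cold start or after the 3-minute expiry
  const key1 = await configs.key1;
  return { statusCode: 200, body: key1 };
};

// anti-pattern: loadConfigs inside the handler rebuilds the cache every time
// module.exports.handler = async (event) => {
//   const configs = loadConfigs(['key1']); // new, empty cache each invocation
//   const key1 = await configs.key1;       // always calls SSM
// };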

@sappusaketh

My async/await method is working. I got confused because I can mock the whole AWS environment locally, but I can't really check the caching locally. I forgot that caching will be enabled on AWS Lambda's hosted containers, so when I checked my CloudWatch logs the caching is working. So once again, thank you very much.

@theburningmonk (Author)

No worries :-) glad it's working for you!
