
@bkardell
Created December 5, 2013 14:48
An optimization debate

Topic:

CDNs and other optimization techniques. This comes up a lot; it crosses numerous mailing lists and Twitter. If you have thoughts on this, let's discuss so we can easily cite/reference it in the future.

Here is a statement from @scottgonzalez on Twitter, and some thoughts from me to open the discussion.

Scott: CDNs have much higher cache-miss rates than you'd think, and JS should be concatenated and deployed from the server

Me: It's true that cache-miss rates are higher, but I don't want to throw the baby out with the bathwater. The advantages of concatenation will largely disappear with HTTP/2. CDNs offer a number of things (some in theory, some in practice) that seem good. At an incredibly utilitarian level, if I can offload that work from my own infrastructure and maybe reduce hops for these requests too, that seems good. At a more conceptual level, the idea that some resources are highly shareable and deserve a special home/cache seems good even if CDNs don't currently fully enable it; that seems less a problem with the CDN than one for the platform to help tackle. It does seem ridiculous to send significant capabilities like jQuery, Ember, or Angular down over and over and ask them to eat up cache space in my own domain. There really should be one and only one copy of a file called jquery-x.y.z.js.

@bkardell

bkardell commented Dec 5, 2013

@scottgonzalez - I think we are actually in agreement about most of that. Clearly it isn't paying off as much as people initially thought - but why? Maybe because the cache serves so many things and is limited per domain, there are too many competing factors. The more people give good cache advice, the more it all competes for that limited space. Since the CDNs themselves are competing and they host much of the same stuff, that plays against you too. It does seem, though, that some things really are "different" and that some of the goals of hosting things on CDNs are valid and deserve further research/attempts. We have some code/goals/thoughts around this in which you ask for something more npm-style and use something like ServiceWorker to deal with the fact that we know the resource could be served from a number of places. If you think about it, every version of jQuery ever released, plus every version of jQuery UI and its assets, all put together is still a pretty negligible amount of space, and their popularity will cycle through the system. It's unfortunate that we aren't smarter about that - there is really no reason to have N identical copies of a source file. It's tricky to fix, but I think the advantage of getting these sort of "unofficial bits of the internet" closer to the metal would be worthwhile.
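The ServiceWorker idea could be sketched roughly like this - a hypothetical fetch handler that maps any recognized CDN copy of jquery-x.y.z.js to one canonical cache key, so a single cached copy can satisfy every page that asks for it from any host. The URL pattern, the synthetic `libs.invalid` cache key, and the `canonicalKey` helper are all assumptions for illustration, not part of any actual proposal here:

```javascript
// Hypothetical sketch: canonicalize library URLs so one cached copy
// serves them all. Pattern and cache name are made up for illustration.

// Map any recognized CDN URL for jquery-x.y.z to one synthetic cache key.
// "libs.invalid" is never fetched; it only names the shared cache entry.
function canonicalKey(url) {
  const m = url.match(/\/jquery[-\/]?(\d+\.\d+\.\d+)(?:\.min)?\.js$/);
  return m ? 'https://libs.invalid/jquery-' + m[1] + '.js' : null;
}

// Only runs inside a real ServiceWorker context, not in a page or Node.
if (typeof self !== 'undefined' && typeof window === 'undefined' &&
    'caches' in self) {
  self.addEventListener('fetch', (event) => {
    const key = canonicalKey(event.request.url);
    if (!key) return; // not a library we canonicalize; let it pass through
    event.respondWith(
      caches.open('shared-libs').then(async (cache) => {
        const hit = await cache.match(key);
        if (hit) return hit; // a copy fetched from ANY CDN satisfies this
        const resp = await fetch(event.request);
        if (resp.ok) await cache.put(key, resp.clone());
        return resp;
      })
    );
  });
}
```

The point of the sketch is only that the version string, not the host, identifies the resource - which is exactly the "1 and only 1 jquery-x.y.z.js" idea above.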

@yoavweiss

@scottgonzalez - If the choice is between a slimmed-down self-hosted version and a bloated CDN version, then it's a no-brainer IMO (even if browser cache-hit rates are higher on the CDN)

@bkardell - I don't believe that cache space is the issue here. Caches shouldn't evict popular resources.
I think the problem is that the multitude of framework versions, plus the fact that there are several "official" CDNs, means your particular choice of version + CDN is not shared by many other sites.
That means the percentage of first-time visitors who have already seen your framework elsewhere is not that high, and even for them, this particular framework + CDN combination is not recognized as a highly popular resource (because it isn't) and may be evicted rather fast to make space for some popular cat photos.
