autoscale: true
- caching*
- search*
- installer support

^ packages, Git, ETags
^ the less said about search, the better
- package metadata (`package.json`*)
- registry metadata (HTTP ETags)
- package tarballs
- installer lockfiles
- Git remotes
- search data*
^ not the same as what's in your package, or on the registry (`_from`, other headers)
^ registry metadata is the above plus the ETag
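A minimal sketch of that difference: npm's cached copy of a manifest is the published `package.json` plus bookkeeping fields from the install request. The field names `_from` and `_resolved` are real; the package name and URL below are made up.

```shell
# Cached package metadata = published package.json + request bookkeeping.
# (Illustrative values only; not read from a real cache.)
metadata='{
  "name": "example-pkg",
  "version": "1.0.0",
  "_from": "example-pkg@^1.0.0",
  "_resolved": "https://registry.example.com/example-pkg/-/example-pkg-1.0.0.tgz"
}'
echo "$metadata"
```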
- packages from a registry
- hosted Git repositories
- local directories
- local tarballs
- remote tarballs via HTTP[S] URLs
^ not just GitHub – `gitlab:`, `gist:`, and `bitbucket:` supported as well
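Illustrative only (every name here is made up): each of these spec shapes goes through the same `npm install` front door and is resolved through the cache. The loop just prints the commands rather than running them.

```shell
# One spec shape per install source; printed, not executed.
specs="example-pkg
othiym23/example-repo
gitlab:example/example-repo
./example-dir
./example-pkg-1.0.0.tgz
https://example.com/example-pkg.tgz"
count=0
for spec in $specs; do
  echo "npm install $spec"
  count=$((count + 1))
done
```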
- all installs go through the cache
- only hit network if registry metadata is stale
- only receive new packages if ETag is stale (304 otherwise)
- always update Git clones
^ the caching logic figures out how to convert your registry requests into an installable package
^ as such, almost as complicated as the installer
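The freshness check can be sketched like this (simulated, no network): npm replays the stored ETag as `If-None-Match`, and a matching ETag means the cached metadata is still good, so nothing new is downloaded.

```shell
# Simulated ETag comparison; the values are placeholders, not real ETags.
stored_etag='W/"5d41402abc"'
server_etag='W/"5d41402abc"'
if [ "$stored_etag" = "$server_etag" ]; then
  result="304: cached registry metadata is fresh"
else
  result="200: fetch and re-cache metadata"
fi
echo "$result"
```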
- `npm install`
- `npm cache`
- `npm cache clean`
- `npm cache ls`
- `npm cache add`
- `npm search <thing>`*
- `npm cache clean <package>/<version>`
- `npm cache clean`
- `sudo rm -rf "$(npm config get cache)"`
- `rm "$(npm config get cache)"/_locks/*`
^ npm doesn't know what's a commit and what's a branch or tag
^ I looked up the correct spelling
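The quoting in that last command matters: inside double quotes, `*` is literal, so the glob must sit outside the quotes to expand. A demo with a throwaway directory (not a real npm cache):

```shell
# Quoted path, unquoted glob: the * expands; fully quoted, rm would look
# for a file literally named '*.lock'.
demo="$(mktemp -d)"
touch "$demo/a.lock" "$demo/b.lock"
rm "$demo"/*.lock
remaining="$(ls "$demo" | wc -l)"
echo "$remaining"
```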
- for shrinkwrap, if any dependency fails, whole install fails
- very large packages are difficult to publish & install
- not designed for offline use
- `--cache-min=999999` will fake it
- …but only if the cache is warmed for the packages you want
^ blows up if you try to install a package / version not found in cache
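As a config fragment, the same fake-offline trick can live in `.npmrc` (a sketch; the comment is mine, the key is the real pre-npm@5 setting):

```ini
; treat anything already cached as fresh for a very long time,
; so installs of warm packages never touch the network
cache-min = 999999
```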
- content-addressable cache
- cacheable shrinkwrap
- true offline mode
- decouple from the installer & `npm-registry-client`
^ a true programmatic interface to the cache would be very handy
^ I've been trying to get to this for years ;_;
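The content-addressable idea in one line of shell (a sketch, not npm's implementation): the data's own digest becomes its cache key, so identical bytes are stored once and every lookup can verify integrity for free.

```shell
# Hash some stand-in bytes; the digest is the content address.
f="$(mktemp)"
printf 'example tarball bytes' > "$f"
key="$(sha256sum "$f" | cut -d' ' -f1)"
echo "$key"
```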
Twitter: @othiym23 GitHub: @othiym23 email: [email protected]