@max-mapper
max-mapper / index.js
Last active May 9, 2021 02:20
fast loading of a large dataset into leveldb
// data comes from here http://stat-computing.org/dataexpo/2009/the-data.html
// download 1994.csv.bz2 and unpack by running: cat 1994.csv.bz2 | bzip2 -d > 1994.csv
// 1994.csv should be ~5.2 million lines and 500MB
// importing all rows into leveldb took ~50 seconds on my machine
// there are two main techniques at work here:
// 1: never create JS objects, leave the data as binary the entire time (binary-split does this)
// 2: group lines into 16 MB batches, to take advantage of leveldb's batch API (byte-stream does this)
var level = require('level')
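
The preview cuts off at the first require. Purely as a sketch (not the gist's actual code), the two techniques described in the comments above might be wired together like this; the byte-stream batch-size argument and the through2 writer are my own assumptions:

// sketch only: stream binary lines into ~16 MB leveldb batches
var fs = require('fs')
var level = require('level')
var split = require('binary-split')      // emits one Buffer per CSV line, no JS objects
var byteStream = require('byte-stream')  // groups buffers into batches up to a byte limit
var through = require('through2')

var db = level('./flights.db')
var id = 0

fs.createReadStream('1994.csv')
  .pipe(split())
  .pipe(byteStream(16 * 1024 * 1024))
  .pipe(through.obj(function (lines, enc, next) {
    // one leveldb batch per ~16 MB group of rows
    db.batch(lines.map(function (line) {
      return { type: 'put', key: 'row-' + (++id), value: line }
    }), next)
  }))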
  • All sessions are 30 minutes long
  • There will be no Q/A (a.k.a: please hang out with the attendees at our awesome bar and hack with them on any questions that might come up)
  • You must use our presenter's computer
  • There will be a presenter remote
  • Your slides must be 1440x1080px

I know that not using your own computer can be painful, but I will go to great lengths to install everything under the sun to make you feel as comfortable on that machine as possible.

@dominictarr
dominictarr / 000136.log
Last active August 28, 2024 21:16
corrupted level db
@tj
tj / example.js
Created July 31, 2013 18:48
console.api()
// extract a function's parameter names from its source text
function params(fn) {
  var str = fn.toString();
  var sig = str.match(/\(([^)]*)\)/)[1];
  if (!sig) return [];
  return sig.split(', ');
}

console.api = function(obj){
  console.log();
  var proto = Object.getPrototypeOf(obj);
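
The preview is truncated after the prototype lookup. As an illustrative guess at the rest (not tj's original code), console.api might enumerate the prototype's methods and print their signatures using params():

// illustrative completion only, not the original gist
console.api = function(obj){
  console.log();
  var proto = Object.getPrototypeOf(obj);
  Object.getOwnPropertyNames(proto).forEach(function (name) {
    var desc = Object.getOwnPropertyDescriptor(proto, name);
    if (!desc || typeof desc.value !== 'function') return;
    console.log('  .%s(%s)', name, params(desc.value).join(', '));
  });
  console.log();
};

console.api([]); // lists the methods on Array.prototype (native functions show empty parameter lists)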
@creationix
creationix / simple-stream.md
Last active December 19, 2015 05:18
Simple Streams

Simple Streams

Simple streams are a modification to [min-streams][] that aren't quite as minimal, but should be much easier to use with only a slight change in definition. After using min-streams for a while, the biggest issue is the confusion between the data channel and the close channel. Also, there is no structural type to tell consumers that a value is a stream; it's just a function with little introspection.

A simple-stream is defined as an object with .read(callback) and .stop(err, callback) functions. These are functions, not methods. This means that you don't have to worry about binding them to the stream object before calling them or using them as callbacks to other functions.

A nice side effect of this new design is that the read function is a continuable already. Libraries that consume generators like [gen-run][] can yield on read directly for easy stream consumption.

var stream = {
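
The preview stops at the example object. Here is a minimal sketch of the shape described above, using my own array-backed source and assuming min-streams' convention that an undefined item signals the end:

// sketch of a simple-stream over an array (illustrative only)
function fromArray(items) {
  var i = 0;
  var stopped = false;
  return {
    // read(callback) delivers (err, item); an undefined item means the stream ended
    read: function (callback) {
      if (stopped || i >= items.length) return callback();
      callback(null, items[i++]);
    },
    // stop(err, callback) asks the source to clean up early
    stop: function (err, callback) {
      stopped = true;
      callback(err);
    }
  };
}

var stream = fromArray([1, 2, 3]);
stream.read(function (err, item) {
  console.log(item); // 1
});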
@heapwolf
heapwolf / npm-qos-heuristic.md
Last active December 11, 2015 10:38
general heuristics for ranking package quality

Health

  • Has CI
  • Tests pass
  • Total number of breaking commits
  • Number of dependencies
  • Average age of open issues
  • Frequency of issues fixed
  • Average response time from a bug being filed to its fix
  • Date of last commit
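
The gist lists signals without weights. Purely as an illustration of how they might be combined (all field names and weights below are made up):

// toy scoring function; fields and weights are illustrative, not from the gist
function qualityScore(pkg) {
  var score = 0;
  if (pkg.hasCI) score += 10;
  if (pkg.testsPass) score += 10;
  score -= pkg.breakingCommits;               // churn hurts
  score -= pkg.dependencyCount * 0.5;         // fewer dependencies is healthier
  score -= pkg.avgIssueAgeDays / 30;          // stale issues hurt
  score += pkg.issuesFixedPerMonth;           // active maintenance helps
  score -= pkg.daysSinceLastCommit / 90;      // recent commits help
  return score;
}

console.log(qualityScore({
  hasCI: true, testsPass: true, breakingCommits: 2, dependencyCount: 4,
  avgIssueAgeDays: 45, issuesFixedPerMonth: 6, daysSinceLastCommit: 10
}))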

The ultimate database of the future

What does the ultimate database look like? Maybe the ultimate database of the future is going to be a many-headed hydra that attempts to solve every problem? Maybe marketing teams are going to be at the heart of its success? Perhaps it will grow tentacles and become violent toward its creator?

Hydra

What does your current database do?

Unless you're a contributor to the software, you're probably limited to understanding the value propositions and a subset of features. Maybe you're an expert. Probably not. This corner is usually dark. A database has historically been a black box. You pick one that seems like the best fit and trust it.

Maybe the database of the future should just be a library, more like BerkeleyDB; I really don't want an entire server. Someone could add a server on top of it if they really wanted one. It would also be nice if understanding it (in its entirety) didn't represent a big investment.

@max-mapper
max-mapper / readme.md
Last active June 3, 2020 00:31
automatically scan for and join open internet enabled wifi networks on linux using node.js (tested on raspberry pi raspbian)
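
The gist body isn't shown in this preview. As a rough sketch of the approach in the description (not the gist's actual code), one way to do it on Raspbian is to shell out to the standard wireless tools; the interface name, the output parsing, and the dhclient usage below are my assumptions:

// rough sketch only; assumes wireless-tools and a dhcp client are installed,
// the script runs as root, and the wifi interface is wlan0
var exec = require('child_process').exec;

function scanOpenNetworks(cb) {
  exec('iwlist wlan0 scan', function (err, stdout) {
    if (err) return cb(err);
    // split the scan output into per-cell blocks and keep the unencrypted ones
    var open = stdout.split('Cell ').filter(function (cell) {
      return /Encryption key:off/.test(cell);
    }).map(function (cell) {
      var m = cell.match(/ESSID:"([^"]*)"/);
      return m && m[1];
    }).filter(Boolean);
    cb(null, open);
  });
}

function join(essid, cb) {
  exec('iwconfig wlan0 essid "' + essid + '" && dhclient wlan0', cb);
}

scanOpenNetworks(function (err, networks) {
  if (err) throw err;
  if (!networks.length) return console.log('no open networks found');
  join(networks[0], function (err) {
    if (err) throw err;
    console.log('joined', networks[0]);
  });
});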
@WebReflection
WebReflection / Object.extra.js
Created August 16, 2012 20:42
Object.getPropertyDescriptor and Object.getPropertyNames
!function(Object, getPropertyDescriptor, getPropertyNames){
  // (C) WebReflection - Mit Style License
  if (!(getPropertyDescriptor in Object)) {
    var getOwnPropertyDescriptor = Object.getOwnPropertyDescriptor;
    Object[getPropertyDescriptor] = function getPropertyDescriptor(o, name) {
      var proto = o, descriptor;
      while (proto && !(
        descriptor = getOwnPropertyDescriptor(proto, name))
      ) proto = proto.__proto__;
      return descriptor;
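
The preview stops mid-function; the closing braces and the getPropertyNames half follow in the full gist, and the wrapper is presumably invoked with (Object, 'getPropertyDescriptor', 'getPropertyNames'). Assuming that, usage looks roughly like this:

// unlike Object.getOwnPropertyDescriptor, this walks the prototype chain
var desc = Object.getPropertyDescriptor([], 'push');
console.log(typeof desc.value);                           // 'function', found on Array.prototype
console.log(Object.getOwnPropertyDescriptor([], 'push')); // undefined, own properties only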
@TooTallNate
TooTallNate / why.md
Created May 26, 2012 18:28
Why is Node async?

Why is Node async?

  • Asynchronous code is how you write low-resource, high-concurrency servers. See http://www.kegel.com/c10k.html.
  • Node embracing async from the get-go means that servers are low-resource, high-concurrency by default.
  • Async + JavaScript = Perfect fit for an event loop.
  • When it comes to threads vs. an event loop, there are times when either is advantageous.
    • But there are things that are very hard or impossible to do with threads.
    • WebSockets are difficult to do properly with threads. That's one example where non-blocking IO (async) has a major advantage.
    • RAM usage is another factor, especially when we're talking about the physical hardware required to run your application, which translates to real dollars.
  • It depends on your application in the end: