⚠️ Note 2023-01-21
Some things have changed since I originally wrote this in 2016. I have updated a few minor details, and the advice is still broadly the same, but there are some new Cloudflare features you can (and should) take advantage of. In particular, pay attention to Trevor Stevens' comment here from 22 January 2022, and to Matt Stenson's useful caching advice. In addition, Backblaze, with whom Cloudflare have a Bandwidth Alliance partnership, have published their own guide detailing how to use Cloudflare Workers to cache content from B2 private buckets. That is worth reading too.
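As a rough illustration of the approach those guides describe (serving B2 objects through a Worker and caching them at Cloudflare's edge), here is a minimal sketch. The B2_ORIGIN URL and the Cache-Control value are placeholders, and the download-authorization step a private bucket would need is omitted.

// Sketch of a Worker that proxies and caches objects from a B2 bucket.
// B2_ORIGIN is a placeholder; a private bucket would also need an
// Authorization header obtained via b2_get_download_authorization.
const B2_ORIGIN = 'https://f000.backblazeb2.com/file/example-bucket';

addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event));
});

async function handleRequest(event) {
  const cache = caches.default; // Cloudflare's edge cache
  const cached = await cache.match(event.request);
  if (cached) return cached;

  const url = new URL(event.request.url);
  const upstream = await fetch(B2_ORIGIN + url.pathname);

  // Responses from fetch() have immutable headers, so copy before editing.
  const response = new Response(upstream.body, upstream);
  response.headers.set('Cache-Control', 'public, max-age=86400');

  event.waitUntil(cache.put(event.request, response.clone()));
  return response;
}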
/**
 * @file get/set caret position and insert text
 * @author islishude
 * @license MIT
 */
export class Caret {
  /**
   * get/set caret position
   * @param {HTMLCollection} target
   */
  constructor(target) {
    this.target = target;
  }
  // The rest of the original gist is not shown in this excerpt; the two
  // methods below are only a minimal sketch for an <input>/<textarea> target.
  getPos() { return this.target.selectionStart; }
  setPos(pos) { this.target.setSelectionRange(pos, pos); }
}
FWIW: I (@rondy) am not the creator of the content shared here, which is an excerpt from Edmond Lau's book. I simply copied and pasted it from another location and saved it as a personal note, before it gained popularity on news.ycombinator.com. Unfortunately, I cannot recall the exact origin of the original source, nor was I able to find the author's name, so I can't provide the appropriate credits.
- By Edmond Lau
- Highly Recommended 👍
- http://www.theeffectiveengineer.com/
Around 2006-2007, it was a bit of a fashion to hook lava lamps up to the build server. Normally, the green lava lamp would be on, but if the build failed, it would turn off and the red lava lamp would turn on.
By coincidence, around that time I actually met (probably) the first person to hook a lava lamp up to a build server. It was Alberto Savoia, who'd founded a testing tools company (one that did some very interesting things around generative testing that have basically gone unnoticed). Alberto had noticed that people did not react with any urgency when the build broke. They'd check in broken code and go off to something else, only reacting to the breakage they'd caused when some other programmer pulled the change and had problems.
""" | |
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy) | |
BSD License | |
""" | |
import numpy as np | |
# data I/O | |
data = open('input.txt', 'r').read() # should be simple plain text file | |
chars = list(set(data)) | |
data_size, vocab_size = len(data), len(chars) |
var active = false;
function changeRefer(details) {
  if (!active) return;
  for (var i = 0; i < details.requestHeaders.length; ++i) {
    if (details.requestHeaders[i].name === 'Referer') {
      details.requestHeaders[i].value = 'http://www.google.com/';
      break;
    }
  }
  // Return the modified headers so the blocking webRequest API applies them.
  return { requestHeaders: details.requestHeaders };
}
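On its own, changeRefer does nothing until it is registered with the webRequest API, which the excerpt above does not show. A minimal Manifest V2 wiring might look like the sketch below; the '<all_urls>' filter and the browser-action toggle for `active` are assumptions, not necessarily how the original extension does it.

// Register the handler as a blocking listener so the modified headers are
// applied before each request is sent (requires the "webRequest",
// "webRequestBlocking" and host permissions in the manifest).
chrome.webRequest.onBeforeSendHeaders.addListener(
  changeRefer,
  { urls: ['<all_urls>'] },
  ['blocking', 'requestHeaders']
);

// One plausible way to flip the `active` switch: clicking the toolbar icon.
chrome.browserAction.onClicked.addListener(function () {
  active = !active;
});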
* { | |
font-size: 12pt; | |
font-family: monospace; | |
font-weight: normal; | |
font-style: normal; | |
text-decoration: none; | |
color: black; | |
cursor: default; | |
} |
// by d whyte
int[][] result;
float t;
float ease(float p) {
  return 3*p*p - 2*p*p*p;
}
float ease(float p, float g) {
  // The body of this overload was cut off in the excerpt; this is the
  // generalised easing curve commonly used in these sketches (an assumption).
  if (p < 0.5) return 0.5*pow(2*p, g);
  else return 1 - 0.5*pow(2*(1-p), g);
}
These are basically my notes. At first I created this gist just as a reminder for myself, but feel free to use it as a starting point for your project. If you have questions you can find me on Twitter: @thomasf https://twitter.com/thomasf This is how I used it on Debian Wheezy testing (https://www.debian.org/releases/testing/).
Discuss, ask questions, etc. here https://news.ycombinator.com/item?id=7445545