jch / .gemrc
Created November 1, 2011 19:10
gemrc example
# http://docs.rubygems.org/read/chapter/11
---
gem: --no-ri --no-rdoc
benchmark: false
verbose: true
update_sources: true
sources:
- http://gems.rubyforge.org/
- http://rubygems.org/
backtrace: true
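
These keys configure the `gem` CLI itself: the `gem:` line supplies default flags for every `gem` command (so `gem install foo` behaves like `gem install foo --no-ri --no-rdoc`), and `sources:` lists the gem servers to query. As a minimal sketch (not part of the gist), assuming the file above is saved as `~/.gemrc`, the parsed settings are visible from Ruby:

```ruby
# Minimal sketch: RubyGems reads ~/.gemrc at startup and exposes the
# parsed settings via Gem.configuration and Gem.sources.
require 'rubygems'

puts Gem.sources.to_a.inspect      # the sources: list above
puts Gem.configuration.verbose     # true
puts Gem.configuration.backtrace   # true
```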
jareware / gist.md
Last active January 30, 2024 03:15
Project-specific lint rules with ESLint

⇐ back to the gist-blog at jrw.fi

Project-specific lint rules with ESLint

A quick introduction

First there was JSLint, and there was much rejoicing. The odd little language called JavaScript finally had some static code analysis tooling to go with its many quirks and surprising edge cases. But people gradually became annoyed with having to lint their code according to the rules dictated by Douglas Crockford, instead of their own.

So JSLint got forked into JSHint, and there was much rejoicing. You could set it up to only complain about the things you didn't want to allow in your project, and shut up about the rest. JSHint has been the de facto standard JavaScript linter for a long while, and continues to be. Yet there will always be things your linter could check for you, but doesn't: your team has agreed on some convention that makes sense for them, but JSHint doesn't have an option for it.

elrayle / byebug_commands.md
Last active August 1, 2024 03:01
Byebug Cheatsheet - organized by related commands

Byebug Cheatsheet

This cheatsheet covers most of the byebug commands, grouped by related functionality (e.g. all breakpoint-related commands are listed together).

To see official help...

| Command | Aliases | Example | Comments |
|---------|---------|---------|----------|
| help | h | h | list top level of all commands |
| help &lt;cmd&gt; | h &lt;cmd-alias&gt; | h n | list the details of a command (example shows requesting details for the `next` command); this works for all commands |
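
For context, a minimal sketch (not from the cheatsheet itself) of where these commands are typed: calling `byebug` in a script pauses execution at that line and opens the `(byebug)` prompt, which accepts the commands in this cheatsheet.

```ruby
# Minimal sketch: drop into byebug mid-script, then use the cheatsheet
# commands at the (byebug) prompt.
require 'byebug'

def total(items)
  sum = 0
  byebug                      # execution pauses here with a (byebug) prompt
  items.each { |n| sum += n }
  sum
end

puts total([1, 2, 3])

# At the (byebug) prompt, for example:
#   help   # top-level list of all commands
#   h n    # details for the `next` command (via its alias)
#   next   # step over the current line
```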
rain-1 / LLM.md
Last active November 2, 2024 12:14
LLM Introduction: Learn Language Models

Purpose

Bootstrap knowledge of LLMs as fast as possible, with a bias/focus toward GPT.

Avoid being a link dump. Try to provide only valuable, well-tuned information.

Prelude

Neural network links to read before starting with transformers.