Mike Slattery (slattery)
```js
// AngularJS helper: only call $apply() when a $digest/$apply cycle is not
// already running, avoiding "$digest already in progress" errors.
$scope.safeApply = function(fn) {
  var phase = this.$root.$$phase;
  if (phase === '$apply' || phase === '$digest')
    this.$eval(fn);   // already inside a cycle; just evaluate
  else
    this.$apply(fn);  // kick off a digest ourselves
};
```
@slattery
slattery / pm2_actionhero_startup.md
Last active September 16, 2015 16:07
pm2 actionhero startup

I like using pm2 to manage node starts and restarts, including actionhero.

Node falls right on its face if any code throws an uncaught exception; we're not at all used to that in Apacheland. pm2 does what forever does for node, plus some more: it outputs a nice report for web boards, it's smart about restarting, and it bails out if the app is faulty and restarts 15 times in a row due to a launch error. It will also combine log output for us if we're running a few servers on the machine.

```sh
# NOTE: you may need g++ if it is not installed
# apt-get install g++
npm install -g pm2
pm2 startup ubuntu
```
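For reference, day-to-day use looks roughly like this once pm2 is installed. The app name and script path here are placeholders, not actionhero specifics:

```sh
# start the app under pm2; it will be restarted if it crashes
pm2 start ./server.js --name my-app
# persist the current process list so the startup script can resurrect it on boot
pm2 save
# check process status and tail the combined logs
pm2 list
pm2 logs
```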
```sql
-- Cut a string at roughly my_length characters, but keep scanning forward
-- to the end of the current word so we never cut mid-word.
-- NOTE: transliterate() is not a PostgreSQL built-in; it is assumed to be a
-- user-defined unaccent-style helper available in this database.
CREATE OR REPLACE FUNCTION cut_nicely(my_string VARCHAR, my_length INTEGER) RETURNS varchar AS $$
DECLARE
  my_pointer INTEGER;
BEGIN
  my_pointer := my_length;
  WHILE my_pointer < length(my_string) AND transliterate(substr(my_string, my_pointer, 1)) ~* '[a-z]' LOOP
    my_pointer := my_pointer + 1;
  END LOOP;
  RETURN substr(my_string, 1, my_pointer);
END;
$$ LANGUAGE plpgsql;
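A quick usage sketch; the exact cut point depends on your transliterate() implementation, so no output is shown:

```sql
-- cuts after the word spanning position 10 instead of in the middle of it
SELECT cut_nicely('postgres makes string handling pleasant', 10);
```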
-- PostgreSQL 9.2 beta (for the new JSON datatype)
-- You can actually use an earlier version and a TEXT type too
-- PL/V8 http://code.google.com/p/plv8js/wiki/PLV8
-- Inspired by
-- http://people.planetpostgresql.org/andrew/index.php?/archives/249-Using-PLV8-to-index-JSON.html
-- http://ssql-pgaustin.herokuapp.com/#1
-- JSON Types need to be mapped into corresponding PG types
--
-- Function: json_path(json, text)
-- DROP FUNCTION json_path(json, text);
CREATE OR REPLACE FUNCTION json_path(data json, path text)
RETURNS text AS
$BODY$
/* JSONPath 0.8.0 - XPath for JSON
*
* Copyright (c) 2007 Stefan Goessner (goessner.net)

Some thoughts on using node-postgres in a web application

This is the approach I've been using for the past year or so. I'm sure it will change as I grow and get exposed to more ideas, but it's worked alright for me so far.

Pooling:

I would definitely use a single pool of clients throughout the application. node-postgres ships with a pool implementation that has always met my needs, but it's also fine to just use the require('pg').Client prototype and implement your own pool if you know what you're doing & have some custom requirements on the pool.
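To make the "implement your own pool" option concrete, here is a minimal sketch of a generic client pool. It is illustrative only, with a made-up `makePool` helper and bare `acquire`/`release` semantics; node-postgres' built-in pool additionally handles connection errors, timeouts, and client validation that this sketch omits:

```javascript
// Minimal client pool sketch: hands out up to `size` clients and queues
// callers once the pool is exhausted. `createClient` is any factory
// (e.g. one that returns a connected pg Client).
function makePool(createClient, size) {
  const idle = [];     // released clients awaiting reuse
  const waiters = [];  // resolve callbacks for queued acquire() calls
  let created = 0;

  function acquire() {
    if (idle.length > 0) return Promise.resolve(idle.pop());
    if (created < size) {
      created += 1;
      return Promise.resolve(createClient());
    }
    // all clients busy: wait until one is released
    return new Promise((resolve) => waiters.push(resolve));
  }

  function release(client) {
    if (waiters.length > 0) waiters.shift()(client);
    else idle.push(client);
  }

  return { acquire, release };
}
```

The important property is the single shared instance: create one pool at startup, `require` it everywhere, and always `release` in a `finally` so a thrown query error can't leak a client.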

```sql
-- Merge two json objects by re-serializing the key/value pairs of both.
-- NOTE: a key present in both inputs will appear twice in the output.
CREATE OR REPLACE FUNCTION public.json_append(data json, insert_data json)
  RETURNS json
  LANGUAGE sql
AS $$
  SELECT ('{' || string_agg(to_json(key) || ':' || value, ',') || '}')::json
  FROM (
    SELECT * FROM json_each(data)
    UNION ALL
    SELECT * FROM json_each(insert_data)
  ) t;
$$;
```
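A quick usage sketch:

```sql
SELECT json_append('{"a":1}'::json, '{"b":2}'::json);
-- {"a":1,"b":2}
```

On PostgreSQL 9.4 and later, the jsonb `||` concatenation operator does this natively (and deduplicates keys, keeping the right-hand value).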
@slattery
slattery / getmetheearliestone.sql
Created December 12, 2014 03:47
get first in each category in PostgreSQL
```sql
-- this acts sort of like a group by with min( timestamp )
SELECT DISTINCT ON (message_id) message_id, date_trunc('day', load_time) AS load_time
FROM www_acp.outgoing_messages
WHERE message_id IN (SELECT document_id FROM documents WHERE name ILIKE '%whatevz%')
ORDER BY message_id, load_time;
```
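The trick is that DISTINCT ON keeps only the first row of each group in ORDER BY order, so whichever column you sort by second decides which row survives. A sketch on a hypothetical `products` table:

```sql
-- first row per category in (category, price) order = cheapest product
SELECT DISTINCT ON (category) category, price
FROM products
ORDER BY category, price;
```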
@slattery
slattery / pluckgitdir
Created December 17, 2014 19:30
pluck a dir from one git repo and make a fresh one
thx http://gbayer.com/development/moving-files-from-one-git-repository-to-another-preserving-history/
Make a copy of repository A so you can mess with it without worrying about mistakes too much. It's also a good idea to delete the link to the original repository to avoid accidentally making any remote changes (line 3). Line 4 is the critical step: it goes through your history and files, removing anything that is not in directory 1. The result is the contents of directory 1 spewed out into the base of repository A. You probably want to import these files into repository B within a directory, so move them into one now (lines 5/6). Commit your changes and we're ready to merge these files into the new repository.
```sh
git clone <git repository A url>
cd <git repository A directory>
git remote rm origin
git filter-branch --subdirectory-filter <directory 1> -- --all
mkdir <directory 1>
mv * <directory 1>
git add .
git commit -m "collected <directory 1> files for the move"
```
@slattery
slattery / cloneoldbaregitrepo
Last active August 29, 2015 14:11
populate github fresh repo with old existing stuff
Create a new repository on GitHub. You'll import your external Git repository to this new repository.

On the command line, make a "bare" clone of the repository using the external clone URL. This creates a full copy of the data, but without a working directory for editing files, and ensures a clean, fresh export of all the old data.

```sh
# Makes a bare clone of the external repository in a local directory
git clone --bare https://githost.org/extuser/repo.git
```

Push the locally cloned repository to GitHub using the "mirror" option, which ensures that all references, such as branches and tags, are copied to the imported repository.
```sh
cd repo.git
# Mirrors the local bare clone (all branches and tags) to the new GitHub repo
git push --mirror https://github.com/ghuser/repo.git
```