Ajmal Afif (ajmalafif) — GitHub Gists
@ajmalafif
ajmalafif / gulpfile.js
Created October 6, 2015 04:03 — forked from martinwolf/gulpfile.js
Jekyll, Browsersync and Gulp
var gulp = require('gulp'),
    sass = require('gulp-ruby-sass'),
    autoprefixer = require('gulp-autoprefixer'),
    minifycss = require('gulp-minify-css'),
    jshint = require('gulp-jshint'),
    uglify = require('gulp-uglify'),
    rename = require('gulp-rename'),
    clean = require('gulp-clean'),
    concat = require('gulp-concat'),
    notify = require('gulp-notify'),
@ajmalafif
ajmalafif / supply-collection-sidebar
Created October 4, 2015 08:27 — forked from cshold/supply-collection-sidebar
The sidebar for Shopify's Supply theme, including code to handle an advanced tagging system.
{% comment %}
  A customized sidebar for this theme. If advanced tagging is enabled in
  theme settings, prepend your tags with "group" names (e.g. BRAND_) and your
  collection page will create groups of tags to sort by.
  Expansion of https://gist.github.com/darryn/8047749
{% endcomment %}
@ajmalafif
ajmalafif / asset_sync_is_the_devil.md
Last active August 31, 2015 07:20 — forked from schneems/asset_sync_is_the_devil.md
I fucking hate asset_sync

A not politically correct assertion of my feelings towards a piece of software:

Note: Repetition builds cynicism; asset_sync isn't bad, but when an asset problem cannot be solved via support it gets escalated to me. Often, when someone is using asset_sync, the problem is due to their use of the library and not to Heroku.

Backstory

The asset_sync gem uploads your assets (images, css, javascript) to S3. From there you can either point browsers to the copy on S3 or use a CDN + the S3 bucket. It's a good idea, and it solved a real problem at one time.

It is no longer needed; you should now use https://devcenter.heroku.com/articles/using-amazon-cloudfront-cdn instead. Rather than copying your assets over to S3 after they are precompiled, the CDN grabs them from your website directly. Here are some reasons why it's better.
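For a Rails app, the switch is mostly configuration. A minimal sketch, assuming a CloudFront distribution whose origin is the app itself; the CDN_HOST environment variable name and value are hypothetical:

# config/environments/production.rb
Rails.application.configure do
  # Serve precompiled asset URLs from the CDN. CloudFront fetches each
  # file from this app (its origin) on the first request and caches it,
  # so nothing needs to be copied to S3 after precompile.
  # CDN_HOST is a hypothetical env var, e.g. "d1234abcd.cloudfront.net".
  config.action_controller.asset_host = ENV["CDN_HOST"]
end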

@ajmalafif
ajmalafif / perfschool-pagespeed
Last active August 29, 2015 14:27 — forked from taesup/perfschool-pagespeed
answer to perfschool problem 1
'use strict';

var fs = require('fs');
var path = require('path');
var express = require('express');
var psi = require('psi'); // PageSpeed Insights API client

var app = express();
var port = process.env.PORT || 7777;

app.get('/', home); // `home` is defined further down in the gist
<script>
  console.log("--- HARP Debugger ---");
  console.log(<%- JSON.stringify(inspect) %>);
  // Just check your console in Chrome to see and inspect the variable.
</script>
/**
* For more information see this tutorial: http://blog.webbb.be/use-jekyll-with-gulp/
*
* Libs import
* --> How to install? npm install --save-dev gulp-minify-html
* @type {[type]}
*/
var gulp = require('gulp'),
    path = require('path'),
def self.find_for_oauth(auth, signed_in_resource = nil)
  # Get the identity and user if they exist
  identity = Identity.find_for_oauth(auth)
  user = identity.user

  # Fall back to the currently signed-in user, if any
  user = signed_in_resource if user.nil?

  if user.nil?
    # Get the existing user by email if the OAuth provider gives us a verified email
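    # (Hedged sketch: the gist preview stops here. Based on the common
    # Devise + OmniAuth identity pattern this gist follows, the method
    # typically continues roughly like this; the attribute names below
    # are assumptions, not the gist's verbatim code.)
    email = auth.info.email
    user = User.where(email: email).first if email

    # Create a new user if no existing account matched the email
    if user.nil?
      user = User.new(
        email: email,
        password: Devise.friendly_token[0, 20]
      )
      user.save!
    end
  end

  # Associate the identity with this user if it isn't already
  if identity.user != user
    identity.user = user
    identity.save!
  end

  user
end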
#!/usr/bin/env ruby

# If you're not using rbenv in this script's dir, you may wanna run
# these as `sudo gem install ruby-trello`, etc.
['ruby-trello', 'dotenv'].each do |gem_name|
  begin
    gem gem_name
  rescue LoadError
    puts "Running `gem install #{gem_name}`..."
    puts "If this takes too long, you may want to run it manually, as `sudo` if needed."
@ajmalafif
ajmalafif / post-checkout.rb
Last active August 29, 2015 14:08 — forked from gavinballard/post-checkout.rb
[git] — post checkout hook
#!/usr/bin/env ruby-rvm-env 1.9.3
require 'yaml'

# Get the "type" of checkout from the arguments Git passes to us.
# Possible values for this are "0" for a file-only checkout (which we don't care about)
# or "1" for a full branch checkout (which we do).
checkout_type = ARGV[2]

if checkout_type == "1"
@ajmalafif
ajmalafif / commands.md
Last active August 29, 2015 14:08 — forked from weotch/commands.md

Since you can't ssh from PagodaBox (the binary isn't installed), you can't rsync or scp between servers. Thus you need to copy the files down to your local machine and then upload them back up.

  1. Enable SSH in PB for both servers.
  2. SSH to the first server and tar the uploads dir: `tar -cvf uploads.tar shared/public/uploads`. I'm not bothering with compression because it would take a long time, and it's all binary files so I don't expect much help from compression anyway.
  3. Download the uploads.tar file to your computer.
  4. Delete the uploads.tar file from the server.
  5. Upload the uploads.tar file to the root of the second server.
  6. Connect to the second server via SSH.
  7. Unpack it: `tar -xvf uploads.tar`.
  8. Delete the uploads.tar file from the second server.