This document is an attempt to pin down all the things you don't think about when quoting for a project, and hopefully to provide a starting point for a framework that makes quoting, working on, and delivering small-to-medium jobs more predictable and less stressful.
/**
 * Helper function to reduce boilerplate in route creation
 *
 * @param {string} path The route's path
 * @param {object|Function|string} page A page component definition, a
 *   function that returns a page import, or a string path to a file in
 *   the view/pages folder
 * @param {object} attrs Any additional attributes
 */
export function route (path, page, attrs = {}) {
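  // Hypothetical continuation -- the original snippet ends at the opening
  // brace. A minimal sketch that resolves the three documented `page`
  // forms into a route record; the lazy-import path and the returned
  // shape are assumptions, not the author's code.
  let component = page
  if (typeof page === 'string') {
    // A string names a file in the view/pages folder; lazy-load it
    component = () => import(`./view/pages/${page}`)
  }
  return Object.assign({ path, component }, attrs)
}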
title: 'Site Settings'
hide: true
sections:
  main:
    fields:
      head_scripts:
        mode: table
        fields:
          description:
            type: text
<?php

use PhpCsFixer\Config;
use PhpCsFixer\Finder;

$rules = [
    'array_indentation' => true,
    'array_syntax' => ['syntax' => 'short'],
    'binary_operator_spaces' => [
        'default' => 'single_space',
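    ],
    // ... the rest of the rule list is cut off in the original snippet ...
];

// Hypothetical continuation: the standard php-cs-fixer wiring a config
// like this normally ends with -- the finder paths are assumptions.
$finder = Finder::create()
    ->in(__DIR__)
    ->name('*.php')
    ->ignoreVCS(true);

return (new Config())
    ->setRules($rules)
    ->setFinder($finder);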
Backblaze's bztransmit process loads a file called bzfileids.dat into RAM. This file is a list of all files Backblaze has previously uploaded, including a unique identifier for each one. On most systems this file is under 100MB in size (paraphrased from Backblaze support rep Zack).
Mine had grown to 6GB. This means that any time bztransmit runs, it loads this 6GB file into RAM for as long as it is backing up. In doing so it was purging massive amounts of memory, causing behavior like Chrome (using 10GB of memory on its own) hanging/beachballing for 30 seconds and then refreshing all of its windows.
There is no way to alter this behavior once it's begun, aside from starting over with some files excluded. The index needs to be rebuilt from scratch without the excessive file count, which also means you can't restart and "inherit" a previous backup.
In my case the biggest culprits were .git and node_modules, so I excluded those, started a new backup (transferred the license), and spent a week hunting for fast internet I could use to re-upload everything.
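To see how much weight those folders add, here's a rough shell sketch (mine, not Backblaze's) that counts the files .git and node_modules directories contribute under your home folder -- every one of those files, however tiny, gets an entry in bzfileids.dat:

# Count files inside .git and node_modules directories under $HOME.
# The starting path is an assumption; point it wherever Backblaze scans.
find ~ -type d \( -name .git -o -name node_modules \) -prune -print0 \
  | xargs -0 -I{} find {} -type f \
  | wc -l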
<?php

namespace App\Providers;

use Illuminate\Http\Request;
use Illuminate\Routing\Route;
use Illuminate\Support\ServiceProvider;
use App\Http\Middleware\CaptureRequestExtension;

class AppServiceProvider extends ServiceProvider
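{
    // Hypothetical continuation -- the original snippet stops at the class
    // declaration. Given the imports above, one plausible boot() registers
    // the CaptureRequestExtension middleware; the group name is an
    // assumption, and the unused Request/Route imports suggest the
    // middleware rewrites the matched route's request.
    public function boot()
    {
        $this->app['router']->pushMiddlewareToGroup('web', CaptureRequestExtension::class);
    }
}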
let webpack = require('webpack');
let path = require('path');

module.exports = {
  entry: {
    app: './resources/assets/js/app.js',
    vendor: ['vue', 'axios']
  },
  output: {
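    // Hypothetical continuation -- the snippet ends at `output`. Typical
    // values for a setup like this; the paths are assumptions.
    path: path.resolve(__dirname, 'public/js'),
    filename: '[name].js'
  },
  plugins: [
    // Split the array-style `vendor` entry above into its own bundle
    // (the webpack 1-3 way of doing vendor chunks)
    new webpack.optimize.CommonsChunkPlugin({ name: 'vendor' })
  ]
};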
// Vanilla version of FitVids
// Still licensed under the WTFPL
//
// Not as robust and fault-tolerant as the jQuery version.
// It's BYOCSS.
// And also, I don't support this at all whatsoever.
;(function(window, document, undefined) {
  'use strict';
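  // Hypothetical continuation -- the original file is truncated here.
  // FitVids' core trick: wrap each embedded video in an element whose
  // padding preserves the aspect ratio; your own CSS (the BYOCSS part)
  // then absolutely positions the iframe to fill that wrapper.
  var videos = document.querySelectorAll(
    'iframe[src*="youtube.com"], iframe[src*="vimeo.com"]'
  );
  for (var i = 0; i < videos.length; i++) {
    var video = videos[i];
    var wrapper = document.createElement('div');
    wrapper.className = 'fluid-width-video-wrapper';
    wrapper.style.paddingTop = (video.height / video.width * 100) + '%';
    video.parentNode.insertBefore(wrapper, video);
    wrapper.appendChild(video);
  }
})(window, document);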
{# usage: {{ siteMacros.related_categories( entry, true, 'rubrics', 'rubric-list', 'Рубрики:', '?news=true' ) }} #}
{# in main layout: {% import '_partials/macros' as siteMacros %} #}
{% macro related_categories(entryModel, isSection=false, catGroupName='categories', className='categories', headingText='', appendToURL='') %}
    {% set categories = craft.categories.group(catGroupName).relatedTo(entryModel) %}
    {% if categories | length %}
        {% if isSection %}
            <section class="{{ className }}">
                <h1>{{ headingText }}</h1>
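                {# Hypothetical continuation -- the original macro is cut
                   off here; one plausible finish for both branches: #}
                <ul>
                    {% for category in categories %}
                        <li><a href="{{ category.url }}{{ appendToURL }}">{{ category.title }}</a></li>
                    {% endfor %}
                </ul>
            </section>
        {% else %}
            <div class="{{ className }}">
                {% for category in categories %}
                    <a href="{{ category.url }}{{ appendToURL }}">{{ category.title }}</a>
                {% endfor %}
            </div>
        {% endif %}
    {% endif %}
{% endmacro %}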
# You don't need Fog in Ruby or some other library to upload to S3 -- shell works perfectly fine
# This is how I upload my new Sol Trader builds (http://soltrader.net)
# Based on a modified script from here: http://tmont.com/blargh/2014/1/uploading-to-s3-in-bash

S3KEY="my aws key"
S3SECRET="my aws secret" # pass these in

function putS3
{
  path=$1
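  # Hypothetical continuation -- the function is cut off here. Following
  # the tmont.com script this one credits: sign the PUT request with
  # HMAC-SHA1 and send it with curl. Bucket name and content type are
  # assumptions.
  file=$2
  bucket='my-bucket'
  date=$(date +"%a, %d %b %Y %T %z")
  acl="x-amz-acl:public-read"
  content_type='application/octet-stream'
  string="PUT\n\n$content_type\n$date\n$acl\n/$bucket/$file"
  signature=$(echo -en "$string" | openssl sha1 -hmac "$S3SECRET" -binary | base64)
  curl -X PUT -T "$path/$file" \
    -H "Host: $bucket.s3.amazonaws.com" \
    -H "Date: $date" \
    -H "Content-Type: $content_type" \
    -H "$acl" \
    -H "Authorization: AWS $S3KEY:$signature" \
    "https://$bucket.s3.amazonaws.com/$file"
}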