Commands, questions and easter eggs for Amazon Alexa enabled devices: https://ugotsta.github.io/alexa-cheats/
- "Alexa, stop."
- "Alexa, volume one/six/ten."
- "Alexa, turn up/down the bass/treble."
- "Alexa, mute."
- "Alexa, unmute."
- "Alexa, repeat."
import java.util.HashMap;

/**
 * Created by Brandon Markwalder on 5/11/2017.
 * Write a function done_or_not passing a board (list[list_lines]) as parameter.
 * If the board is valid return 'Finished!', otherwise return 'Try again!'
 *
 * Sudoku rules:
 *
 * ROWS:
exFAT support on macOS seems to have some bugs: my external drives with exFAT formatting will randomly get corrupted. If Disk Utility is unable to repair, consider trying this:

Run `diskutil list` to find the right drive id (e.g. `disk1s1`), then run `sudo fsck_exfat -d <id from above>`, e.g. `sudo fsck_exfat -d disk1s3`. The `-d` flag enables debug mode, so you'll see all your files output as they're processed.

Here are the simple steps needed to create a deployment from your local Git repository to a server, based on this in-depth tutorial.
You are developing in a working copy on your local machine, let's say on the master branch. Most of the time, people push code to a remote server like github.com or gitlab.com and pull or export it to a production server. Alternatively, you can use a service like deepl.io to act upon a webhook that triggers the deployment.
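The push-to-deploy setup such tutorials describe boils down to a bare repository on the server plus a post-receive hook. Here is a minimal sketch; all paths, the remote name `production`, and the branch `master` are illustrative and not taken from the tutorial, and both the "server" and "local" sides are directories on one machine so the sketch stays runnable:

```shell
#!/bin/sh
# Minimal push-to-deploy sketch. All paths and names are illustrative;
# on a real server the remote URL would look like
# ssh://user@yourserver/var/repo/site.git

REPO=/tmp/demo-site.git   # "server side": bare repository
WEBROOT=/tmp/demo-www     # "server side": directory served to the web
SRC=/tmp/demo-src         # "local side": your working copy

# 1. Server: create a bare repo and a post-receive hook that checks out master
git init -q --bare "$REPO"
mkdir -p "$WEBROOT"
cat > "$REPO/hooks/post-receive" <<EOF
#!/bin/sh
git --work-tree=$WEBROOT --git-dir=$REPO checkout -f master
EOF
chmod +x "$REPO/hooks/post-receive"

# 2. Local: commit something and push to the "production" remote
mkdir -p "$SRC"
cd "$SRC"
git init -q
git symbolic-ref HEAD refs/heads/master  # make sure the branch is master
echo "hello" > index.html
git add index.html
git -c user.email=demo@example.com -c user.name=demo commit -qm "first deploy"
git remote add production "$REPO"
git push -q production master

# The post-receive hook has now checked index.html out into $WEBROOT
cat "$WEBROOT/index.html"
```

Every subsequent `git push production master` redeploys automatically, because the hook re-checks the work tree out on each push.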
#!/bin/bash

# Stop all containers
containers=`docker ps -a -q`
if [ -n "$containers" ]; then
  docker stop $containers
fi

# Delete all containers
containers=`docker ps -a -q`
if [ -n "$containers" ]; then
  docker rm -f -v $containers
fi
# You don't need Fog in Ruby or some other library to upload to S3 -- shell works perfectly fine
# This is how I upload my new Sol Trader builds (http://soltrader.net)
# Based on a modified script from here: http://tmont.com/blargh/2014/1/uploading-to-s3-in-bash

S3KEY="my aws key"
S3SECRET="my aws secret" # pass these in

function putS3
{
  path=$1
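The snippet is cut off after `path=$1`; in the linked 2014 post the function goes on to build a string to sign, HMAC it with the secret, and PUT the file with curl. A rough sketch of how that continuation looks, assuming the legacy AWS Signature Version 2 scheme that post used (the bucket name and content type below are made up):

```shell
# Continuation sketch -- assumes AWS Signature v2 as in the linked 2014 post.
# $S3KEY / $S3SECRET come from the variables set above; bucket name and
# content type are illustrative.
function putS3
{
  path=$1       # local directory containing the file
  file=$2       # file name to upload
  aws_path=$3   # destination path in the bucket, e.g. /builds/
  bucket='my-bucket'
  date=$(date -u +"%a, %d %b %Y %T %z")
  content_type='application/x-compressed-tar'
  # String to sign: verb, (empty) MD5, content type, date, resource
  string="PUT\n\n$content_type\n$date\n/$bucket$aws_path$file"
  signature=$(printf '%b' "$string" | openssl sha1 -hmac "$S3SECRET" -binary | base64)
  curl -X PUT -T "$path/$file" \
    -H "Host: $bucket.s3.amazonaws.com" \
    -H "Date: $date" \
    -H "Content-Type: $content_type" \
    -H "Authorization: AWS $S3KEY:$signature" \
    "https://$bucket.s3.amazonaws.com$aws_path$file"
}
```

Note that Signature v2 is long deprecated; new buckets require Signature v4, so today you would more likely reach for `aws s3 cp` from the AWS CLI.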
-- This SQL file will remove all users for a specific blog from the network tables (`wp_users` and `wp_usermeta`)
-- Set the value of `@newBlogID` to the ID of the blog for which you want to remove all users.
-- Useful for reimporting content and users/rerunning a migration.

SET @newBlogID = TKTK;

DROP TEMPORARY TABLE IF EXISTS temp_user_ids;
CREATE TEMPORARY TABLE IF NOT EXISTS temp_user_ids SELECT user_id AS ID FROM wp_usermeta WHERE meta_key = CONCAT('wp_', @newBlogID, '_capabilities');

DELETE FROM wp_users WHERE ID IN (SELECT ID FROM temp_user_ids);
DELETE FROM wp_usermeta WHERE user_id IN (SELECT ID FROM temp_user_ids);
module.exports = {
  easeIn: 'ease-in',
  easeOut: 'ease-out',
  easeInOut: 'ease-in-out',
  snap: [0.000, 1.000, 0.500, 1.000],
  linear: [0.250, 0.250, 0.750, 0.750],
  easeInQuad: [0.550, 0.085, 0.680, 0.530],
  easeInCubic: [0.550, 0.055, 0.675, 0.190],
  easeInQuart: [0.895, 0.030, 0.685, 0.220],
  easeInQuint: [0.755, 0.050, 0.855, 0.060],
#!/bin/sh
# copied from http://zeroset.mnim.org/2013/03/14/sftp-support-for-curl-in-ubuntu-12-10-quantal-quetzal-and-later/
mkdir /tmp/curl
cd /tmp/curl
sudo apt-get update
sudo apt-get install build-essential debhelper libssh2-1-dev
apt-get source curl
sudo apt-get build-dep curl
cd curl-*
dpkg-buildpackage
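As is standard for `dpkg-buildpackage`, the resulting `.deb` packages land in the parent directory, from where they can be installed with `dpkg -i`. Once installed, you can check whether the rebuilt binary actually gained SFTP support by looking at the protocol list that `curl -V` reports:

```shell
# Verify that the installed curl was built with SFTP support
if curl -V | grep -qw sftp; then
  echo "sftp supported"
else
  echo "sftp NOT supported"
fi
```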
The VM is reachable at vagrant.dev with the IP address 192.168.33.10 (see your Vagrantfile and /etc/hosts). The network site is network.org and the project site is project.org. The admin username is admin and the password is password.

Install the snapshot plugin with `vagrant plugin install vagrant-vbox-snapshot` from the Vagrant host. To take a snapshot, run `vagrant snapshot take default [optional-name-for-snapshot]`. Take one before starting, and then at the start of every major section here.