I hereby claim:
- I am bpo on github.
- I am bpo (https://keybase.io/bpo) on keybase.
- I have a public key whose fingerprint is 4B0B 3036 5986 CCA7 C828 DE7C 8B01 56B9 5A56 2B86
To claim this, I am signing this object:
require "redis"

# REDISGREEN_URL is a Heroku-style env var (see the checker script below),
# so read it from the environment.
redis = Redis.new(url: ENV.fetch("REDISGREEN_URL"))

cursor = "0"
loop do
  cursor, keys = redis.scan(cursor)
  keys.each do |key|
    puts key # the original loop body is cut off here; printing is a placeholder
  end
  # SCAN may legitimately return empty batches mid-iteration, so breaking on an
  # empty batch would stop too early; only a "0" cursor means the scan is done.
  break if cursor == "0"
end
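redis-rb also wraps this cursor bookkeeping for you; an equivalent sketch using its built-in scan_each iterator (the MATCH pattern here is only illustrative):

# Iterate keys without managing the cursor by hand.
redis.scan_each(match: "*") do |key|
  puts key
end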
#!/usr/bin/env ruby

# 1. Add your environment variable keys below to check them
#    for connectivity.
CHECK = [
  'REDISGREEN_URL',
  # Any other services external to Heroku should have their URLs here
]
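The body of the script is cut off above; a minimal sketch of the check it implies, assuming each listed variable holds a URL and that a successful TCP connect counts as connectivity:

require "uri"
require "socket"

CHECK.each do |var|
  url = ENV[var] or abort "#{var} is not set"
  uri = URI(url)
  begin
    TCPSocket.new(uri.host, uri.port).close # raises if unreachable
    puts "#{var}: ok"
  rescue => e
    puts "#{var}: FAILED (#{e.class}: #{e.message})"
  end
end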
# Read this aloud.
5.minutes.ago > 10.minutes.ago
=> true
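It reads backwards but is correct: .ago returns a timestamp, and the more recent instant (5 minutes ago) is the larger value. The same comparison in plain Ruby, no ActiveSupport required:

now = Time.now
five_minutes_ago = now - 5 * 60   # later, therefore larger
ten_minutes_ago  = now - 10 * 60  # earlier, therefore smaller
five_minutes_ago > ten_minutes_ago # => true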
:chef-solo (master)$ env | grep -i vagrant
:chef-solo (master)$ time vagrant --version
Vagrant 1.6.5

real	0m0.262s
user	0m0.229s
sys	0m0.029s
:chef-solo (master)$ export VAGRANT_CHECKPOINT_DISABLE=1
:chef-solo (master)$ time vagrant --version
Vagrant 1.6.5

:chef-solo (master)$ env | grep -i vagrant
:chef-solo (master)$ time vagrant status
Current machine states:

default                   running (vmware_fusion)

The VM is running. To stop this VM, you can run `vagrant halt` to
shut it down, or you can run `vagrant suspend` to simply suspend
the virtual machine. In either case, to restart it again, run
`vagrant up`.
INFO loader: Set :root = #<Pathname:/Users/bpo/x/chef-solo/Vagrantfile>
DEBUG loader: Populating proc cache for #<Pathname:/Users/bpo/x/chef-solo/Vagrantfile>
DEBUG loader: Load procs for pathname: /Users/bpo/x/chef-solo/Vagrantfile
INFO loader: Loading configuration in order: [:home, :root]
DEBUG loader: Loading from: root (evaluating)
DEBUG provisioner: Provisioner defined: chef_solo
DEBUG loader: Configuration loaded successfully, finalizing and returning
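For context, the root Vagrantfile being evaluated here is itself Ruby; a minimal sketch consistent with this log (only the chef_solo provisioner is confirmed by the log; the box and recipe names are hypothetical):

# /Users/bpo/x/chef-solo/Vagrantfile (sketch)
Vagrant.configure("2") do |config|
  config.vm.box = "precise64"          # hypothetical box name
  config.vm.provision :chef_solo do |chef|
    chef.cookbooks_path = "cookbooks"  # hypothetical layout
    chef.add_recipe "example"          # hypothetical recipe
  end
end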
ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAo5dm6zR3pd9L3pkgD37qh5ffvsu/4CA7zP5gzkL85ZkGB+EB83mWFCpduIL00fpM1lYURZ8uYtskmbLcoTZgKMd7VWJBklXzn+VMGFDyauyoQPSpiEIc0wliyHsqSDiOiTdwSWSmO3hfDp8Yc433QcqltMmbktpTCyLViLR3stp2D/V7mk/GsMVXFyUBvVrODRENMVhf643QhYEoNJOCDLm6UdcEPc1xb3Rs1oqOhJvDT2gz67DrpQ7t8+sh7ULcg+7oHVfGNW1grecYm0u8umPUspJtAmKLBgptFEEFoasbGurOXHufC1cQEFlN09aMy5zoxVlAkVIMYE1ekcgJwQ== [email protected]
:tf-testing (master)$ terraform show terraform.tfstate
aws_instance.example:
  id = i-5a8e9471
  ami = ami-408c7f28
  availability_zone = us-east-1c
  instance_type = t1.micro
  key_name =
  private_dns = ip-172-31-23-101.ec2.internal
  private_ip = 172.31.23.101
  public_dns = ec2-54-210-135-203.compute-1.amazonaws.com
2014/07/10 17:01:55 Packer Version: 0.6.0 12e28f257f66299e3bb13a053bf06ccd236e7efd
2014/07/10 17:01:55 Packer Target OS/Arch: darwin amd64
2014/07/10 17:01:55 Built with Go Version: go1.2
2014/07/10 17:01:55 Detected home directory from env var: /Users/bpo
2014/07/10 17:01:55 Attempting to open config file: /Users/bpo/.packerconfig
2014/07/10 17:01:55 File doesn't exist, but doesn't need to. Ignoring.
2014/07/10 17:01:55 Packer config: &{PluginMinPort:0 PluginMaxPort:0 Builders:map[googlecompute:packer-builder-googlecompute qemu:packer-builder-qemu parallels-iso:packer-builder-parallels-iso parallels-pvm:packer-builder-parallels-pvm null:packer-builder-null amazon-chroot:packer-builder-amazon-chroot amazon-instance:packer-builder-amazon-instance digitalocean:packer-builder-digitalocean openstack:packer-builder-openstack virtualbox-iso:packer-builder-virtualbox-iso virtualbox-ovf:packer-builder-virtualbox-ovf amazon-ebs:packer-builder-amazon-ebs docker:packer-builder-docker vmware-iso:packer-builder-vmware-is