Moving forward, updates to this subflow will be found at the following repository: https://github.com/sstratoti/actionable-notifications-subflow-for-ios
// PUT YOUR AWS ACCOUNT NUMBER HERE
var AWS_ACCOUNT_ID = '12345';
// PUT YOUR SQS QUEUE NAME HERE
var AWS_SQS_QUEUE_NAME = 'catch-dlr-dyer-testing';
var QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/' + AWS_ACCOUNT_ID + '/' + AWS_SQS_QUEUE_NAME;
var AWS = require('aws-sdk');
var sqs = new AWS.SQS({ region: 'us-east-1' });
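With the client and queue URL defined, pushing a message onto the queue is a single sendMessage() call. The sketch below is a minimal, hypothetical example; the message body and logging are placeholders rather than part of the original snippet:

// Hypothetical example: send a test message to the queue defined above
sqs.sendMessage({
    MessageBody: JSON.stringify({ hello: 'world' }),
    QueueUrl: QUEUE_URL
}, function(err, data) {
    if (err) {
        console.log('Error sending message:', err);
    } else {
        console.log('Message sent, id:', data.MessageId);
    }
});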
package main

import (
    "crypto/tls"
    "crypto/x509"
    "flag"
    "io/ioutil"
    "log"
    "net/http"
)
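These imports suggest an HTTPS client that trusts a custom CA certificate. The following is a minimal sketch under that assumption; the -ca and -url flags and the rest of main() are illustrative, not taken from the original:

// Hypothetical main(): build an HTTPS client that trusts a custom CA and fetch a URL
func main() {
    caFile := flag.String("ca", "ca.pem", "path to a PEM-encoded CA certificate")
    url := flag.String("url", "https://localhost:8443/", "URL to fetch")
    flag.Parse()

    // Load the CA certificate and add it to a fresh cert pool
    caCert, err := ioutil.ReadFile(*caFile)
    if err != nil {
        log.Fatal(err)
    }
    pool := x509.NewCertPool()
    if !pool.AppendCertsFromPEM(caCert) {
        log.Fatal("failed to parse CA certificate")
    }

    // Use the pool as the set of trusted roots for TLS connections
    client := &http.Client{
        Transport: &http.Transport{
            TLSClientConfig: &tls.Config{RootCAs: pool},
        },
    }

    resp, err := client.Get(*url)
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()

    body, err := ioutil.ReadAll(resp.Body)
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("%s", body)
}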
Using the nc command, you can scan a single port or a range of ports to verify whether UDP ports are open and able to receive traffic.
This first command will scan all of the UDP ports from 1 to 65535 and add the results to a text file:
$ nc -vnzu server.ip.address.here 1-65535 > udp-scan-results.txt
This merely tells you that the UDP ports are open and receive traffic.
Perhaps a more revealing test would be to actually transfer a file using UDP.
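For example, you could start a UDP listener on the remote server and stream a file to it. The port number and file names below are arbitrary placeholders, and depending on your netcat variant you may need "-l -p 4444" instead of "-l 4444":

# On the receiving server: listen on UDP port 4444 and write whatever arrives to a file
$ nc -u -l 4444 > received-file.txt

# On the sending machine: stream the file to that port over UDP
$ nc -u server.ip.address.here 4444 < file-to-send.txt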
# jdyer at MacBook-Pro.local in ~/Projects/consul [15:56:56]
$ dig @localhost -p 8600 _sip._udp.service.consul srv

; <<>> DiG 9.10.0-P2 <<>> @localhost -p 8600 _sip._udp.service.consul srv
; (3 servers found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 55514
;; flags: qr aa rd; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1
;; WARNING: recursion requested but not available
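For reference, an SRV answer like the one above typically comes from a Consul service definition along these lines; the service name "sip", the "udp" tag, and the port are assumptions inferred from the query, not taken from the original:

{
  "service": {
    "name": "sip",
    "tags": ["udp"],
    "port": 5060
  }
}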
# example location parts of nginx.conf
# add your own AWS keys, server lines etc, and set your AWS domains and paths
http {
    # you will need luacrypto on the cpath; download it from http://luacrypto.luaforge.net/
    lua_package_cpath "/home/justin/lua/luacrypto-0.2.0/src/l?.so.0.2.0;;";

    server {
        listen 80;
Operation: Decouple Whisper from Graphite.
Method: Create a Graphite function that runs a date-histogram facet query against Elasticsearch for a given query string, covering the time period shown in the current graph.
Reason: Graphite has some awesome math functions. Wouldn't it be cool if we could use those on Logstash results?
The screenshot below uses Logstash to watch the Twitter stream for the keywords "iphone", "apple", and "samsung"; I then graph each one so we get an idea of their relative popularity. As a bonus, I also apply movingAverage() to the iphone curve to show why this is awesome.
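As an illustration, the resulting graph targets might look something like this; logstash() is a hypothetical stand-in for whatever the Elasticsearch-backed function ends up being called, while movingAverage() is an existing Graphite function:

# Hypothetical Graphite targets, assuming a new logstash("<query string>") function
logstash("apple")
logstash("samsung")
movingAverage(logstash("iphone"), 10)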
#!/usr/bin/env ruby

def yield_noargs
  puts "hi from yield_noargs"
  yield if block_given?
end

def yield_args(foo)
  puts "hi from yield_args"
  puts "Was passed #{foo}"
  yield(foo) if block_given?
end
# Rake task to copy local files to a remote server via FTP
# Requires a credentials.yml file containing the keys:
#   server, username, password
require "net/ftp"
require "yaml"

class FTPClient
  attr_reader :remote_path
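The class is cut off above; here is a minimal sketch of how such a client might continue, loading credentials.yml and uploading a file with Net::FTP. The method names and structure are assumptions, not the original implementation:

# Hypothetical continuation of FTPClient (a sketch, not the original class body)
class FTPClient
  def initialize(remote_path)
    @remote_path = remote_path
    @credentials = YAML.load_file("credentials.yml")
  end

  def upload(local_file)
    Net::FTP.open(@credentials["server"]) do |ftp|
      ftp.login(@credentials["username"], @credentials["password"])
      ftp.chdir(@remote_path)
      ftp.putbinaryfile(local_file)
    end
  end
end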
## mysql::master
ruby_block "store_mysql_master_status" do
  block do
    node.set[:mysql][:master] = true
    m = Mysql.new("localhost", "root", node[:mysql][:server_root_password])
    m.query("show master status") do |row|
      row.each_hash do |h|
        node.set[:mysql][:master_file] = h['File']
        node.set[:mysql][:master_position] = h['Position']
      end