Latency Comparison Numbers
--------------------------
L1 cache reference                       0.5 ns
Branch mispredict                        5   ns
L2 cache reference                       7   ns                14x L1 cache
Mutex lock/unlock                       25   ns
Main memory reference                  100   ns                20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy         3,000   ns
Send 1K bytes over 1 Gbps network   10,000   ns    0.01 ms
Read 4K randomly from SSD*         150,000   ns    0.15 ms
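
The multipliers in the right-hand column are just ratios of the raw figures:
7 ns / 0.5 ns = 14, so an L2 hit costs 14x an L1 hit, and 100 ns / 0.5 ns = 200
for a main memory reference versus L1.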

strarsis / logstash.conf
------------------------
Created December 7, 2016 22:59, forked from TinLe/logstash.conf
my logstash.conf file for postfix

input {
  file {
    path => "/var/log/maillog*"
    exclude => "*.gz"
    start_position => "beginning"
    type => "maillog"
  }
}

filter {
  if [type] == "maillog" {
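
The filter block is cut off in this view. As a sketch only (the pattern below
is an assumption, not the gist's actual rules), a postfix filter typically
starts by splitting the syslog preamble from the payload:

filter {
  if [type] == "maillog" {
    # pull timestamp, host and program[pid] out of the raw syslog line;
    # illustrative pattern, not taken from the original gist
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:msg}" }
    }
    # use the syslog timestamp as the event time instead of the read time
    date {
      match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}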

strarsis / maillog.json
-----------------------
Created December 7, 2016 22:59, forked from TinLe/maillog.json
maillog elasticsearch mapping template

{
  "template" : "maillog-*",
  "order" : 1,
  "settings" : {
    "number_of_shards" : 2,
    "index.refresh_interval" : "90s"
  },
  "mappings" : {
    "maillog" : {
      "properties" : {

strarsis / gist:9678545f91d76a3a7afa307c7389435b
------------------------------------------------
Created December 5, 2016 23:01, forked from mjpowersjr/gist:740a9583e9ec8b49e0a3
Parsing the MySQL slow query log via Logstash (the easy way?)

The MySQL slow query log is a difficult format to extract information from. After looking at various examples with mixed results, I realized that it's much easier to configure MySQL to write the slow query log to a table in CSV format!

From the MySQL documentation:

By default, the log tables use the CSV storage engine that writes data in comma-separated values format. For users who have access to the .CSV files that contain log table data, the files are easy to import into other programs such as spreadsheets that can process CSV input.

my.cnf

Note: don't forget to open up permissions on your slow query log CSV file so Logstash can read it!

# enable slow query log
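slow_query_log  = 1
# anything slower than this many seconds gets logged; the threshold here is
# an assumption, tune it to taste
long_query_time = 1
# send the slow log to the mysql.slow_log table, which the CSV storage
# engine backs with a plain slow_log.CSV file on disk
log_output      = TABLE

With that in place, the CSV file (usually /var/lib/mysql/mysql/slow_log.CSV,
though the path may vary) can be tailed by Logstash. A minimal sketch, with
the column list assumed from the layout of the mysql.slow_log table rather
than taken from the original gist:

input {
  file {
    path => "/var/lib/mysql/mysql/slow_log.CSV"
    start_position => "beginning"
    type => "mysql-slow"
  }
}

filter {
  if [type] == "mysql-slow" {
    # one slow_log row per line, columns in mysql.slow_log order
    csv {
      columns => [ "start_time", "user_host", "query_time", "lock_time",
                   "rows_sent", "rows_examined", "db", "last_insert_id",
                   "insert_id", "server_id", "sql_text" ]
    }
  }
}

Queries whose sql_text spans multiple lines would additionally need a
multiline codec on the input, which is the one wrinkle in the "easy way".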