Karol Bucek (kares)

@yaauie
yaauie / replace-with-serialzied.logstash-filter-ruby.rb
Last active April 26, 2022 06:56
Logstash Ruby Filter script to replace a structured event's data with a single field containing a JSON-serialized string representing the same data.
###############################################################################
# replace-with-serialzied.logstash-filter-ruby.rb
# ---------------------------------
# A script for a Logstash Ruby Filter to replace the event's contents with a
# single field containing a string JSON-encoded representation of the event.
#
# This filter _MUTATES_ the event, removing all DATA-keys while leaving METADATA
# intact.
#
###############################################################################
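The script body itself is not included in this listing; a minimal sketch of how such a filter script could look (the target_field parameter name is an assumption, not taken from the original gist):

require "json"

def register(params)
  # name of the field that receives the serialized event (assumed parameter name)
  @target_field = params.fetch("target_field", "message")
end

def filter(event)
  data = event.to_hash                       # DATA keys only; @metadata is not included
  serialized = data.to_json
  # remove the original DATA keys, keeping @timestamp on the event itself
  # (the original gist may handle @timestamp/@version differently)
  data.each_key { |k| event.remove(k) unless k == "@timestamp" }
  event.set(@target_field, serialized)       # single JSON-encoded string field
  [event]
end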

The included apply-template.rb provides a way to generate Logstash config fragments from a shared template.

This can be useful for verbose configuration that is shared across multiple pipelines.

For example, if we are using multiple pipelines with pipelines.yml:

 - pipeline.id: one
   path.config: "${LOGSTASH_HOME}/pipelines/one/*.conf"
 - pipeline.id: two
   path.config: "${LOGSTASH_HOME}/pipelines/two/*.conf"
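The template mechanism itself is not shown in this listing. Purely to illustrate the idea (this is an assumption, not apply-template.rb's actual syntax), a shared ERB fragment could be rendered once per pipeline into each pipeline's config directory:

require "erb"
require "fileutils"

# hypothetical shared fragment with a per-pipeline port (illustration only)
TEMPLATE = <<~ERB
  input {
    beats { port => <%= port %> }
  }
ERB

# render one copy of the fragment per pipeline
{ "one" => 5044, "two" => 5045 }.each do |name, port|
  FileUtils.mkdir_p("pipelines/#{name}")
  rendered = ERB.new(TEMPLATE).result_with_hash(port: port)
  File.write("pipelines/#{name}/00-input.conf", rendered)
end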
await new Promise(function (resolve) {
  setTimeout(function () {
    resolve();
  }, 1000);
});
// ... Can be shortened to:
await new Promise(function (resolve) {
  setTimeout(resolve, 1000);
});
# Usage:
#
# ruby send_broken.rb <host> <port> <CA_file> <Cert_file> <Key_file>
# ruby send_broken.rb 127.0.0.1 5044 /Users/joaoduarte/certs/RootCA.crt /Users/joaoduarte/certs/Client-Root.crt /Users/joaoduarte/certs/Client-Root.key
#
# encoding: utf-8
require "socket"
require "zlib"
require "json"
require "openssl"
@dhh
dhh / Gemfile
Created June 24, 2020 22:23
HEY's Gemfile
ruby '2.7.1'
gem 'rails', github: 'rails/rails'
gem 'tzinfo-data', '>= 1.2016.7' # Don't rely on OSX/Linux timezone data
# Action Text
gem 'actiontext', github: 'basecamp/actiontext', ref: 'okra'
gem 'okra', github: 'basecamp/okra'
# Drivers
# Bootstrap Logstash's Ruby environment and load logstash-core outside of bin/logstash
require_relative "lib/bootstrap/environment"
LogStash::Bundler.setup!({:without => [:build, :development]})
require "logstash-core"
require "logstash/environment"
require "logstash/config/source/local"
require "logstash/java_pipeline"
require "logstash/plugin"
# route Logstash's log4j output to ./logs in plain-text format
java.lang.System.setProperty("ls.logs", "logs")
java.lang.System.setProperty("ls.log.format", "plain")
@jaflo
jaflo / description.md
Created May 6, 2020 03:51
Export Instapaper to HTML & PDF

Use this to automatically scrape all of your saved Instapaper articles locally as HTML and PDF files. I originally wrote this to read my saved documents on my reMarkable tablet. Instapaper does not have an option to export all my stuff as PDF as far as I could tell (the built-in options only export a subset).

You will need to have the following packages installed:

Configure your username and password, then run the script. It will go through all articles shown on your home page, download the copy Instapaper has stored into a folder called output as an HTML file, and convert it into a PDF. You can customize the look by updating the included styles.css file. Any errors will be reported and logged to failed.txt. Errors might be due to parsing problems on Instapaper's side or to PDF conversion issues.

@MioOgbeni
MioOgbeni / configmap-pipelines.yml
Last active July 7, 2020 14:11
configmap-pipelines
apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-pipelines
  labels:
    app: logstash
data:
  event-hub-input.conf: |-
    input {
      azure_event_hubs {
/tmp/logstash-7.6.0 ❯ bin/logstash -i irb
Sending Logstash logs to /tmp/logstash-7.6.0/logs which is now configured via log4j2.properties
irb(main):001:0> grok_class = LogStash::Plugin.lookup("filter", "grok")
=> LogStash::Filters::Grok
irb(main):002:0> grok = grok_class.new("match" => { "message" => [ "%{WORD:word}", "%{NUMBER:num}" ] })
=> <LogStash::Filters::Grok match=>{"message"=>["%{WORD:word}", "%{NUMBER:num}"]}, id=>"grok_1a49e57c-96f1-4381-b421-0fb93adf6eec", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, timeout_scope=>"pattern", tag_on_timeout=>"_groktimeout">
irb(main):003:0> grok.register
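From here the instantiated filter can be exercised against a hand-built event. A sketch of how the session might continue (the event contents and output shown are assumptions, not captured from a real session):

irb(main):004:0> event = LogStash::Event.new("message" => "hello 42")
irb(main):005:0> grok.filter(event)
irb(main):006:0> event.get("word")
=> "hello"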
# this terminates quickly in MRI but hangs in JRuby: the nested quantifier
# (?:=+=+)+ backtracks catastrophically on the long run of '=' that is not followed by ':'
def regex_dos
  "foo========:bar baz================================================bingo".scan(/(?:=+=+)+:/)
end
t = Thread.new { regex_dos(); puts "done" }
puts t.status
t.kill
puts t.status
t.join