require 'openssl'
require 'net/http'

def cert_from_url(url)
  txt = Net::HTTP.get(URI(url))
  OpenSSL::X509::Certificate.new(txt)
end
LEAF_CERTIFICATE = OpenSSL::X509::Certificate.new %q[
-----BEGIN CERTIFICATE-----
# reproducer for https://github.com/jruby/jruby-openssl/issues/236
# If a certificate has two trust paths, JRuby doesn't prioritize non-expired certificates, while CRuby (openssl 1.1.1+) does.
# In this reproducer we have a leaf certificate with two possible chains:
# a) leaf -> intermediate cert A -> ISRG Root X1 cross-signed by (expired) DST Root CA X3 -> (expired) DST Root CA X3
# b) leaf -> intermediate cert B -> ISRG Root X1
# JRuby will build chain a), causing an error, while CRuby builds the valid chain b).
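The chain-building step the reproducer exercises is `OpenSSL::X509::Store#verify`. A minimal self-contained sketch of that mechanism, using a locally generated root and leaf instead of the real ISRG Root X1 / DST Root CA X3 certificates (all names here are illustrative):

```ruby
require 'openssl'

# Issue a certificate; self-signed when no issuer is given.
def issue_cert(subject, key, issuer_cert: nil, issuer_key: nil, ca: false)
  cert = OpenSSL::X509::Certificate.new
  cert.version = 2
  cert.serial = OpenSSL::BN.rand(64)
  cert.subject = OpenSSL::X509::Name.parse(subject)
  cert.issuer = issuer_cert ? issuer_cert.subject : cert.subject
  cert.public_key = key.public_key
  cert.not_before = Time.now - 3600
  cert.not_after  = Time.now + 3600
  ef = OpenSSL::X509::ExtensionFactory.new
  ef.subject_certificate = cert
  ef.issuer_certificate  = issuer_cert || cert
  cert.add_extension(ef.create_extension('basicConstraints', ca ? 'CA:TRUE' : 'CA:FALSE', true))
  cert.sign(issuer_key || key, OpenSSL::Digest.new('SHA256'))
  cert
end

root_key = OpenSSL::PKey::RSA.new(2048)
leaf_key = OpenSSL::PKey::RSA.new(2048)
root = issue_cert('/CN=Test Root', root_key, ca: true)
leaf = issue_cert('/CN=leaf.example', leaf_key, issuer_cert: root, issuer_key: root_key)

store = OpenSSL::X509::Store.new
store.add_cert(root)
puts store.verify(leaf)                           # chain built against the trusted root
puts store.chain.map { |c| c.subject.to_s }.inspect
```

In the real reproducer the store holds both roots (one expired), and the bug is which of the two candidate chains the runtime's `verify` settles on.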
Repos related to CI
- [ ] [logstash-devutils](https://github.com/elastic/logstash-devutils)
- [ ] [docs-tools](https://github.com/elastic/docs-tools)
- [ ] [.ci](https://github.com/logstash-plugins/.ci)
- [ ] [infra](https://github.com/elastic/infra/tree/master/ci/jjb/logstash-ci) (NOTE: no branch changes needed here, only adapting CI jobs)
Repos associated with Docs
- [ ] [logstash-docs](https://github.com/elastic/logstash-docs)
jsvd / 0_setup_readme.txt
Last active April 20, 2021 15:14
reproduce jdbc_static issues
1. start postgres instance using docker
2. generate csv and import it to a "data" table
3. download logstash and the postgres jdbc driver
4. use logstash configuration to stress the jdbc_static filter
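Step 4 can be sketched with a minimal `jdbc_static` filter against the "data" table from step 2. Table and column names, the driver path, and connection details below are illustrative, not the actual stress configuration:

```
filter {
  jdbc_static {
    loaders => [
      {
        id => "remote-data"
        query => "SELECT id, value FROM data"
        local_table => "data"
      }
    ]
    local_db_objects => [
      {
        name => "data"
        index_columns => ["id"]
        columns => [["id", "varchar(64)"], ["value", "varchar(255)"]]
      }
    ]
    local_lookups => [
      {
        query => "SELECT value FROM data WHERE id = :id"
        parameters => { id => "[id]" }
        target => "enriched"
      }
    ]
    loader_schedule => "*/30 * * * * *"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/postgres"
    jdbc_user => "postgres"
  }
}
```

A short `loader_schedule` like the one above forces frequent reloads of the local table, which is one way to stress the filter.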
# Usage:
#
# ruby send_broken.rb <host> <port> <CA_file> <Cert_file> <Key_file>
# ruby send_broken.rb 127.0.0.1 5044 /Users/joaoduarte/certs/RootCA.crt /Users/joaoduarte/certs/Client-Root.crt /Users/joaoduarte/certs/Client-Root.key
#
# encoding: utf-8
require "socket"
require "zlib"
require "json"
require "openssl"
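A minimal sketch of how the `zlib` and `json` requires combine in a Beats-style sender: events are serialized to JSON and deflate-compressed before framing. The helper name is hypothetical and the actual script's framing is not reproduced here:

```ruby
require "zlib"
require "json"

# Hypothetical helper: serialize a batch of events to JSON lines and
# deflate-compress the result, as a compressed-frame payload would be.
def compress_batch(events)
  Zlib::Deflate.deflate(events.map { |e| JSON.generate(e) }.join("\n"))
end

payload = compress_batch([{ "message" => "hello" }, { "message" => "world" }])
# The payload round-trips through inflate back to the JSON lines:
puts Zlib::Inflate.inflate(payload)
```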
/tmp/logstash-7.10.1
❯ cat cfg
input {
  generator {
    codec => json
    message => '{"systat": {"start": {"timestamp": {"timesecs": 10}}, "intervals": {"streams": {"end": 10}}}}'
    count => 1
  }
}
filter {
❯ curl -s localhost:9600/_node/stats | jq '.pipelines.main.plugins.filters[] | select(.events.in!=.events.out)'
{
  "id": "75afda0f03a5af46279c4cba9408ca87664b9c988bf477e2a2cca535e59e856f",
  "events": {
    "in": 1,
    "out": 0,
filebeat.inputs:
- type: logs
  paths:
    - "/var/log/test/*.log"
- type: csv
  paths:
    - "/data/*.csv"
package org.logstash.generated;

public final class CompiledDataset461 extends org.logstash.config.ir.compiler.BaseDataset
    implements org.logstash.config.ir.compiler.Dataset {
  private final java.util.ArrayList field0;
  private final org.logstash.generated.CompiledDataset460 field1;
  private final org.jruby.RubyArray field2;
  private final org.logstash.config.ir.compiler.FilterDelegatorExt field3;

  public CompiledDataset461(java.util.Map arguments) {
require_relative "lib/bootstrap/environment"
LogStash::Bundler.setup!({:without => [:build, :development]})
require "logstash-core"
require "logstash/environment"
require "logstash/config/source/local"
require "logstash/java_pipeline"
require "logstash/plugin"
java.lang.System.setProperty("ls.logs", "logs")
java.lang.System.setProperty("ls.log.format", "plain")