@cddr
Last active September 21, 2016 09:59
#!/bin/bash
#
# Here's the scenario:
#
# You can ssh into a shared environment containing Kafka, but you need to
# go through a bastion server to get there. How can you easily stream Kafka
# data into your local system (e.g. to get test data, or to gather data from
# shared environments to reproduce a bug)?
#
# This script can be executed on the bastion. It picks a random mesosslave to
# ssh into, runs the kafka-avro-console-consumer command there, and streams
# the data back to your console. The script may live locally; you can execute
# it on the bastion using something like
#
# `ssh foo 'bash -s topic' < ~/bin/kafka-data`
#
# That executes this script on foo, passing "topic" as the first argument.
# If topic is a Kafka Avro topic accessible from the randomly selected
# mesosslave, its contents are sent to the STDOUT of the calling terminal.
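#
# Since the consumer's output arrives on the STDOUT of your local terminal, a
# plain shell redirect is enough to capture the stream into a local file for
# later replay, e.g. (the output filename here is just an example):
#
# `ssh foo 'bash -s topic' < ~/bin/kafka-data > topic-data.json`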
SCHEMA_REGISTRY_URL=http://schema-registry.localservice
ZOOKEEPER_SERVICE=zookeeper.service.consul
# Emit (rather than run) the consumer command, so the resulting string can be
# passed to ssh as the command to execute on the remote node.
function kafka-avro-consume () {
  echo kafka-avro-console-consumer \
    --property schema.registry.url="$SCHEMA_REGISTRY_URL" \
    --zookeeper "$ZOOKEEPER_SERVICE" \
    --topic "$1" \
    --from-beginning
}
# Pick a random mesosslave node from the consul member list.
function random-node () {
  consul members | grep mesosslave | shuf -n 1 | awk '{ print $1 }'
}
ssh "$(random-node)" "$(kafka-avro-consume "$1")"