$ eval $(docker-machine env --swarm mhs-demo0-do)
$ docker info
Containers: 3
Images: 2
Role: primary
Strategy: spread
Filters: health, port, dependency, affinity, constraint
Nodes: 2
 mhs-demo0-do: 188.166.113.176:2376
  └ Containers: 2
docker run --name iperf_host -d -ti --net host \
	--env="constraint:node==mhs-demo0-do" \
	mustafaakin/alpine-iperf iperf -s
# Wait a little for the server to come up
sleep 1
for run in {1..5}; do
	docker run --net host --env="constraint:node==mhs-demo1-do" -ti \
		mustafaakin/alpine-iperf \
		iperf -c 188.166.113.176 # the mhs-demo0-do node IP from docker info above
done
docker network create -d overlay mynet
docker run --name iperf_overlay -d -ti --net mynet \
	--env="constraint:node==mhs-demo0-do" \
	mustafaakin/alpine-iperf iperf -s
# Wait a little for the server to come up
sleep 1
IP=$(docker inspect -f "{{ .NetworkSettings.Networks.mynet.IPAddress }}" iperf_overlay)
for run in {1..5}; do
	docker run --net mynet --env="constraint:node==mhs-demo1-do" -ti \
		mustafaakin/alpine-iperf \
		iperf -c $IP
done
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.AscendingTimestampExtractor;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
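
Given those imports, the environment setup would plausibly look like the sketch below. This is not the article's verbatim code: it assumes the pre-1.9 Table API that org.apache.flink.table.api.java.StreamTableEnvironment belongs to, that Time is java.sql.Time, and that extractor is the watermark assigner referenced later.

// Event-time setup, since the SQL windows below are defined over rowtime
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

// Timestamps are assumed monotonically increasing per source, so an
// AscendingTimestampExtractor is enough to derive watermarks
AscendingTimestampExtractor<Tuple3<String, Double, Time>> extractor =
		new AscendingTimestampExtractor<Tuple3<String, Double, Time>>() {
			@Override
			public long extractAscendingTimestamp(Tuple3<String, Double, Time> element) {
				return element.f2.getTime(); // f2 holds the creation timestamp
			}
		};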
SELECT
  room,
  TUMBLE_END(rowtime, INTERVAL '10' SECOND),
  AVG(temperature) AS avgTemp
FROM sensors
GROUP BY
  TUMBLE(rowtime, INTERVAL '10' SECOND),
  room
// A data stream; in production this would be Kafka, Kinesis or another stream
DataStream<String> text = env.socketTextStream("localhost", port, "\n");
// Parse the raw text into a row format and assign event-time watermarks
DataStream<Tuple3<String, Double, Time>> dataset = text
	.map(mapFunction)
	.assignTimestampsAndWatermarks(extractor);
// Register it so we can refer to it as 'sensors' in SQL
tableEnv.registerDataStream("sensors", dataset, "room, temperature, creationDate, rowtime.rowtime");
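
With the table registered, the tumbling-window query above can be executed and turned back into a stream. A sketch, assuming a Table API release that has sqlQuery and toAppendStream (the very oldest releases named the first one sql):

// A GROUP BY over a tumbling window emits append-only results,
// so converting with toAppendStream is safe here
Table result = tableEnv.sqlQuery(
		"SELECT room, TUMBLE_END(rowtime, INTERVAL '10' SECOND), AVG(temperature) AS avgTemp " +
		"FROM sensors " +
		"GROUP BY TUMBLE(rowtime, INTERVAL '10' SECOND), room");
DataStream<Row> resultStream = tableEnv.toAppendStream(result, Row.class);
resultStream.print();
env.execute("Average room temperature over 10 second windows");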
hyperkit -A -m 512M -s 0:0,hostbridge \
	-s 31,lpc \
	-l com1,stdio \
	-s 1:0,ahci-hd,file://$(pwd)/$QCOW2,format=qcow \
	-s 5,ahci-cd,$(pwd)/seed.img \
	-f kexec,$KERNEL,$INITRD,$CMDLIN
mirage_block_open: block_config = file:///Users/mustafa/Downloads/xenial-server-cloudimg-amd64-disk1.qcow2 and qcow_config = None and stats_config = None
hyperkit: [INFO] Resized file to 32768 clusters (4194304 sectors)
hyperkit: [INFO] image has 0 free sectors and 32764 used sectors
mirage_block_open: block_config = file:///Users/mustafa/Downloads/xenial-server-cloudimg-amd64-disk1.qcow2 and qcow_config = None and stats_config = None returning 0
<dependency>
	<groupId>org.apache.parquet</groupId>
	<artifactId>parquet-common</artifactId>
	<version>1.9.0</version>
</dependency>
<dependency>
	<groupId>org.apache.parquet</groupId>
	<artifactId>parquet-avro</artifactId>
	<version>1.9.0</version>
</dependency>
import org.apache.avro.reflect.ReflectData;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetFileWriter.Mode;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;

Configuration conf = new Configuration();
Path p = new Path("data.parquet"); // org.apache.hadoop.fs.Path, not java.nio.file.Path
ParquetWriter<FlowLog> writer = AvroParquetWriter.<FlowLog>builder(p)
		.withSchema(ReflectData.AllowNull.get().getSchema(FlowLog.class)) // derive the schema from the POJO, all fields nullable
		.withDataModel(ReflectData.get())
		.withConf(conf)
		.withCompressionCodec(CompressionCodecName.SNAPPY)
		.withWriteMode(Mode.OVERWRITE)
		.build();
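
A short usage sketch for the writer above. FlowLog is assumed to be a plain POJO; the setter and the read-back part (org.apache.parquet.hadoop.ParquetReader and org.apache.parquet.avro.AvroParquetReader) are illustrative, not the article's code:

// Write a record and finalize the file; close() writes the Parquet footer
FlowLog log = new FlowLog();
log.setAccount("123456789012"); // hypothetical setter on the FlowLog POJO
writer.write(log);
writer.close();

// Read the records back with the same reflect data model
try (ParquetReader<FlowLog> reader = AvroParquetReader.<FlowLog>builder(p)
		.withDataModel(ReflectData.get())
		.build()) {
	FlowLog rec;
	while ((rec = reader.read()) != null) {
		System.out.println(rec);
	}
}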
SchemaBuilder.RecordBuilder<Schema> builder = SchemaBuilder
	.record("flow_logs").namespace("com.opsgenie");
schema = builder.fields()
	.name("version").type().intType().intDefault(0)
	.name("account").type().stringType().stringDefault("")
	.name("interfaceId").type().stringType().stringDefault("")
	.name("sourceAddress").type().stringType().stringDefault("")
	// continues for remaining 10 fields..
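
The chain only yields a Schema once it is terminated with endRecord(). A sketch of finishing and using it with generic records (org.apache.avro.generic); the field values here are made up:

Schema schema = builder.fields()
		// ... the field definitions above ...
		.endRecord(); // terminates the builder chain and produces the Schema

// Build records against the schema without needing a POJO class
GenericRecord record = new GenericData.Record(schema);
record.put("version", 2);
record.put("account", "123456789012");
record.put("interfaceId", "eni-abc123de");
record.put("sourceAddress", "172.31.16.139");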