This is inspired by "A half-hour to learn Rust" and "Zig in 30 minutes".
Your first Go program, the classic "Hello World", is pretty simple.
First we create a workspace for our project:
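A minimal sketch of those two steps, assuming a modules-based setup (Go 1.11+); the module path example.com/hello is an illustrative placeholder:

mkdir hello && cd hello
go mod init example.com/hello

Then main.go holds the classic program:

package main

import "fmt"

func main() {
    fmt.Println("Hello, World!")
}

Run it from inside the workspace with go run .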
###########################################################
# How to NEVER use lambdas. An inefficient and yet educa- #
# tonal [sic] guide to the proper misuse of the lambda    #
# construct in Python 3.x. [DO NOT USE ANY OF THIS EVER]  #
# original by (and apologies to): e000 (13/6/11)          #
# now in Python 3 courtesy of: khuxkm (17/9/20)           #
###########################################################

## Part 1. Basic LAMBDA Introduction ##

# If you're reading this, you've probably already read e000's

import itertools
from multiprocessing import Process, cpu_count
from multiprocessing import Pool
from multiprocessing.pool import ThreadPool

#
# CONFIG
#
MAX_POOL_PROCESSES = cpu_count() - 1
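The preview cuts off after the config. As a hedged sketch of the kind of lambda abuse a guide like this builds toward (my illustration, not the gist's actual continuation), a lambda can at least be mapped across the ThreadPool imported above — threads, because lambdas cannot be pickled for a process Pool:

# Illustrative only: lambdas aren't picklable, so Pool(...).map(lambda ...)
# fails with a pickling error; ThreadPool avoids that by never serializing.
pool = ThreadPool(MAX_POOL_PROCESSES)
squares = pool.map(lambda x: x * x, range(10))
pool.close(); pool.join()
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]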
;; In response to blog post:
;; https://medium.com/@kasperpeulen/10-features-from-various-modern-languages-that-i-would-like-to-see-in-any-programming-language-f2a4a8ee6727
;; Run with lumo
;; https://github.com/anmonteiro/lumo
;;
;; # npm install -g lumo-cljs
;; lumo clojurescript-feature-examples.cljs
Kafka 0.11.0.0 (Confluent 3.3.0) added support for manipulating a consumer group's offsets via the kafka-consumer-groups CLI command.

kafka-consumer-groups --bootstrap-server <kafkahost:port> --group <group_id> --describe

Note the values under "CURRENT-OFFSET" and "LOG-END-OFFSET": "CURRENT-OFFSET" is the consumer group's current position in each partition, while "LOG-END-OFFSET" is the offset of the last message written to that partition.
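The manipulation itself goes through --reset-offsets on the same tool. A hedged sketch, with placeholder host, group, and topic names (the group must have no active members while offsets are reset):

# preview the new offsets; without --execute this is a dry run
kafka-consumer-groups --bootstrap-server kafkahost:9092 --group group_id \
  --topic my_topic --reset-offsets --to-earliest --dry-run

# apply them
kafka-consumer-groups --bootstrap-server kafkahost:9092 --group group_id \
  --topic my_topic --reset-offsets --to-earliest --execute

Other reset strategies include --to-latest, --to-offset, --shift-by, and --to-datetime.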
package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
	"os"
	"sync"
	"time"
)
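The preview stops at the imports. As a hedged sketch of how they fit together (illustrative only, not the gist's actual body): crypto/rand plus math/big draws uniform secure integers, sync fans the work out, and time measures the run:

func main() {
	start := time.Now()
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			// rand.Int returns a uniform value in [0, max) from the secure source
			n, err := rand.Int(rand.Reader, big.NewInt(1000))
			if err != nil {
				fmt.Fprintln(os.Stderr, err)
				return
			}
			fmt.Println(n)
		}()
	}
	wg.Wait()
	fmt.Println("elapsed:", time.Since(start))
}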
library(plyr); library(dplyr); library(leaflet); library(stringi);
library(htmltools); library(RColorBrewer); library(rvest)

# Parse and read storm track data.
html <- read_html('http://weather.unisys.com/hurricane/atlantic/2016/index.php')
links <- html_attr(html_nodes(html, "a"), "href")
links <- links[grep('track.dat', links)]
track <- select.list(links, title="Select storm:", graphics = FALSE)
#track <- "MATTHEW/track.dat"
import java.io.InputStream

import org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils
import org.apache.spark.sql.{ DataFrame, Row }
import org.postgresql.copy.CopyManager
import org.postgresql.core.BaseConnection

val jdbcUrl = s"jdbc:postgresql://..." // db credentials elided

val connectionProperties = {
  val props = new java.util.Properties()
  props.setProperty("driver", "org.postgresql.Driver")
  props // sketch of a minimal closing for the truncated block; user/password elided, like the URL above
}
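From these pieces, a hedged sketch of the COPY path (my illustration under assumptions, not the gist's actual body: target_table is a placeholder, and the naive comma join does no quoting or escaping):

def copyPartition(rows: Iterator[Row]): Unit = {
  val conn = java.sql.DriverManager.getConnection(jdbcUrl, connectionProperties)
  try {
    // CopyManager needs the underlying PostgreSQL connection
    val copy = new CopyManager(conn.unwrap(classOf[BaseConnection]))
    val csv = rows.map(_.mkString(",")).mkString("\n")
    val in: InputStream = new java.io.ByteArrayInputStream(csv.getBytes("UTF-8"))
    copy.copyIn("COPY target_table FROM STDIN WITH (FORMAT csv)", in)
  } finally conn.close()
}

Wired into Spark with df.rdd.foreachPartition(copyPartition _), each executor streams its partition straight into Postgres instead of issuing row-by-row INSERTs.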
#!/bin/bash
# unfortunately debian currently panics in xhyve
tmp=$(mktemp -d)
pushd "$tmp"
iso="$HOME"/Downloads/debian-8.1.0-amd64-netinst.iso
#iso="$HOME"/Downloads/debian-8.1.0-i386-netinst.iso
echo "fixing disk"
# start tmp.iso with a zeroed first 2 KB; the rest of the ISO is appended next
dd if=/dev/zero bs=2k count=1 of=tmp.iso
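The preview stops here; in the standard xhyve workaround (hedged: this follows the xhyve README's Linux-boot recipe, not necessarily this script's literal next line), the zero block is followed by the rest of the original image:

dd if="$iso" bs=2k skip=1 >> tmp.iso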