
Tim Van Wassenhove (timvw)

@myusuf3
myusuf3 / delete_git_submodule.md
Created November 3, 2014 17:36
How to effectively delete a git submodule.

To remove a submodule you need to (a consolidated command sequence follows this list):

  • Delete the relevant section from the .gitmodules file.
  • Stage the .gitmodules changes: git add .gitmodules
  • Delete the relevant section from .git/config.
  • Run git rm --cached path_to_submodule (no trailing slash).
  • Run rm -rf .git/modules/path_to_submodule (no trailing slash).
  • Commit: git commit -m "Removed submodule"
  • Delete the now-untracked submodule files: rm -rf path_to_submodule
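
Taken together, the steps above amount to the following sequence (a sketch, assuming the submodule's section name in .gitmodules matches its path):

$ git config -f .gitmodules --remove-section submodule.path_to_submodule
$ git add .gitmodules
$ git config --remove-section submodule.path_to_submodule
$ git rm --cached path_to_submodule
$ rm -rf .git/modules/path_to_submodule
$ git commit -m "Removed submodule"
$ rm -rf path_to_submodule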
(*
CapabilityBasedSecurity_ConfigExample.fsx
An example of a simple capability-based design.
Related blog post: http://fsharpforfunandprofit.com/posts/capability-based-security/
*)

/// Configuration system
module Config =

    module Person =
        open System

        type PersonState = private { id: Guid; name: string; age: int }

        let createPerson id name age = { id = id; name = name; age = age }

        let changeName name personState = { personState with name = name }

        let changeAge age personState =
            // some crazy business rule involving age
            { personState with age = age }

    module SomeOtherModule =
@ipropper
ipropper / client.java
Last active October 18, 2022 05:19
Simple Mutual Authentication Client with SSL Sockets (JAVA)
//CLIENT
// this method is quick and dirty, but should get the service up and running
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;
import java.io.IOException;
import java.io.OutputStreamWriter;
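
The preview cuts off after the imports; the following is a minimal sketch of the mutual-authentication flow those imports suggest. The keystore/truststore file names, passwords, host, and port are placeholder assumptions, not values from the gist:

import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;
import javax.net.ssl.TrustManagerFactory;
import java.io.FileInputStream;
import java.io.OutputStreamWriter;
import java.security.KeyStore;

public class MutualAuthClient {
    public static void main(String[] args) throws Exception {
        // Client identity: presented to the server during the TLS handshake
        KeyStore keyStore = KeyStore.getInstance("JKS");
        keyStore.load(new FileInputStream("client.jks"), "changeit".toCharArray());
        KeyManagerFactory kmf = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(keyStore, "changeit".toCharArray());

        // Trust store: decides which server certificates we accept
        KeyStore trustStore = KeyStore.getInstance("JKS");
        trustStore.load(new FileInputStream("truststore.jks"), "changeit".toCharArray());
        TrustManagerFactory tmf = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(trustStore);

        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(kmf.getKeyManagers(), tmf.getTrustManagers(), null);

        SSLSocketFactory factory = ctx.getSocketFactory();
        try (SSLSocket socket = (SSLSocket) factory.createSocket("localhost", 8443)) {
            // fails here if either side rejects the other's certificate
            socket.startHandshake();
            OutputStreamWriter out = new OutputStreamWriter(socket.getOutputStream());
            out.write("hello over mutual TLS\n");
            out.flush();
        }
    }
}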
@tobilg
tobilg / custom_s3_endpoint_in_spark.md
Last active July 31, 2024 10:22
Description of how to use a custom S3 endpoint (like RADOS Gateway for Ceph)

Custom S3 endpoints with Spark

To be able to use custom endpoints with the latest Spark distribution, one needs to add an external package (hadoop-aws). Then, custom endpoints can be configured according to the docs.

Use the hadoop-aws package

bin/spark-shell --packages org.apache.hadoop:hadoop-aws:2.7.2

SparkContext configuration
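
The gist is truncated at this point; as a sketch of the usual approach, the S3A options can be set on the SparkContext's Hadoop configuration from spark-shell (the endpoint URL, credentials, and bucket below are placeholders):

sc.hadoopConfiguration.set("fs.s3a.endpoint", "http://my-ceph-gateway:7480")
sc.hadoopConfiguration.set("fs.s3a.access.key", "ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "SECRET_KEY")

// reads then go through the s3a:// scheme
val lines = sc.textFile("s3a://my-bucket/my-file.txt")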

@bastman
bastman / docker-cleanup-resources.md
Created March 31, 2016 05:55
docker cleanup guide: containers, images, volumes, networks

Docker - How to clean up (unused) resources

Once in a while, you may need to clean up resources (containers, volumes, images, networks) ...

delete volumes

# see: https://github.com/chadoe/docker-cleanup-volumes

$ docker volume rm $(docker volume ls -qf dangling=true)

$ docker volume ls -qf dangling=true | xargs -r docker volume rm
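
Docker 1.13+ also ships built-in prune commands covering all of the resource types named in the title; a quick sketch:

$ docker container prune   # remove all stopped containers
$ docker image prune       # remove dangling images
$ docker network prune     # remove unused networks
$ docker volume prune      # remove unused volumes
$ docker system prune      # containers, images, and networks in one pass (add --volumes to include volumes)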

@dcode
dcode / GitHub Flavored Asciidoc (GFA).adoc
Last active May 26, 2025 06:38
Demo of some useful tips for using Asciidoc on GitHub

GitHub Flavored Asciidoc (GFA)

@walm
walm / main.go
Last active November 5, 2024 16:22
Simple Golang DNS Server
package main

import (
	"fmt"
	"log"
	"strconv"

	"github.com/miekg/dns"
)
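
The preview stops at the imports; below is a minimal sketch of how a DNS server built on github.com/miekg/dns typically looks. The zone name, the fixed answer address, and the port are illustrative assumptions, not the gist's actual values:

package main

import (
	"log"
	"net"

	"github.com/miekg/dns"
)

// handleQuery answers every A query under example.local. with a fixed address.
func handleQuery(w dns.ResponseWriter, r *dns.Msg) {
	m := new(dns.Msg)
	m.SetReply(r)
	for _, q := range r.Question {
		if q.Qtype == dns.TypeA {
			m.Answer = append(m.Answer, &dns.A{
				Hdr: dns.RR_Header{Name: q.Name, Rrtype: dns.TypeA, Class: dns.ClassINET, Ttl: 60},
				A:   net.ParseIP("127.0.0.1"),
			})
		}
	}
	w.WriteMsg(m)
}

func main() {
	dns.HandleFunc("example.local.", handleQuery)
	server := &dns.Server{Addr: ":5353", Net: "udp"}
	log.Printf("listening on %s", server.Addr)
	if err := server.ListenAndServe(); err != nil {
		log.Fatalf("failed to start server: %v", err)
	}
}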
@jpallari
jpallari / article.org
Last active November 9, 2022 18:46
Enforcing invariants in Scala datatypes

Enforcing invariants in Scala datatypes

Scala provides many tools to help us build programs with fewer runtime errors. Instead of relying on nulls, the recommended practice is to use the Option type. Instead of throwing exceptions, the Try and Either types are used to represent potential error scenarios. What these features have in common is that they capture runtime behaviour in the type system, lifting the handling of runtime scenarios to the compilation phase: your program doesn't compile until you've explicitly handled nulls, exceptions, and the other failure cases in your code.
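
The same idea extends to enforcing your own invariants: hide the constructor and expose a validating factory, so invalid values become unrepresentable. A minimal sketch (the Age type and its range check are illustrative, not from the article):

final class Age private (val value: Int) {
  override def toString = s"Age($value)"
}

object Age {
  // the only way to obtain an Age is through this validating factory
  def fromInt(value: Int): Either[String, Age] =
    if (value >= 0 && value <= 150) Right(new Age(value))
    else Left(s"age out of range: $value")
}

// callers must handle the failure case before they can use the Age
Age.fromInt(42)  // Right(Age(42))
Age.fromInt(-1)  // Left(age out of range: -1)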

In his “Strategic Scala Style” blog post series,

@yoyama
yoyama / Schema2CaseClass.scala
Created January 20, 2017 07:36
Generate a case class from a Spark DataFrame/Dataset schema.
/**
 * Generate Case class from DataFrame.schema
 *
 *   val df: DataFrame = ...
 *
 *   val s2cc = new Schema2CaseClass
 *   import s2cc.implicit._
 *
 *   println(s2cc.schemaToCaseClass(df.schema, "MyClass"))
 *