Viacheslav Rodionov (bepcyc)

  • Qualcomm
  • Germany
@10c8
10c8 / default.sh
Last active September 18, 2024 19:01
Vast.ai ComfyUI Provisioning Script
#!/bin/bash
DEFAULT_WORKFLOW="https://gist.github.com/10c8/1d11361a82d63da4116437e67d4a8f7e/raw"
APT_PACKAGES=(
# "package-1"
)
PIP_PACKAGES=(
# "package-1"
)
@sarthakpranesh
sarthakpranesh / cleanMacVMs.sh
Last active August 15, 2024 20:34
Debloat macOS (use at your own risk)
# I use macOS VMs from GitHub for iOS development.
# Unsurprisingly, they are a bit slow and come with a lot of things I don't use.
# Hence this script, for a lighter and faster VM for my iOS development and builds.
# GUI and animation related things to tweak
defaults write NSGlobalDomain NSAutomaticWindowAnimationsEnabled -bool false
defaults write NSGlobalDomain NSWindowResizeTime -float 0.001
defaults write -g QLPanelAnimationDuration -float 0
defaults write com.apple.dock autohide-time-modifier -float 0
defaults write com.apple.dock launchanim -bool false
# Disable the low-priority I/O throttle so background tasks run at full speed
sudo sysctl debug.lowpri_throttle_enabled=0
@jtojnar
jtojnar / gimp-nix.md
Last active April 18, 2023 04:38
How to use Nix to build fresh GIMP

Nix is a package manager that you can install in parallel with your system package manager on Linux or macOS. It allows you to effortlessly build any of the thousands of packages defined in the nixpkgs repository, or write your own package expressions. Packages in the nixpkgs repository are periodically built by our CI server, so the GIMP dependencies will be fetched from the binary cache, saving you time.

To obtain Nix, you can simply run an installation script, or use your package manager if Nix is packaged in your distro's repositories. Notably, there are packages in Arch's AUR and in Debian Unstable.

The pull request containing the latest changes for GIMP master is NixOS/nixpkgs#67576. I usually update the branch several times a month to point to t…

@anatolebeuzon
anatolebeuzon / garmin_fit_to_trainingpeaks.py
Created March 1, 2020 21:09
Batch upload your Garmin .fit files to trainingpeaks.com using this Python script. I had 5000 of them, and their customer support suggested I drag and drop each of them individually. So I did, kind of :-)
import logging
import os
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.support import expected_conditions
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException, TimeoutException
from joblib import Parallel, delayed
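
The preview above stops at the imports. As a rough, hedged sketch of how the rest of such a batch upload might look with Selenium and joblib: the upload URL, the file-input selector, the folder path, and the omission of login/session handling are all assumptions for illustration, not details from the original gist.

import os
import time
import logging
from joblib import Parallel, delayed
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException, TimeoutException

logging.basicConfig(level=logging.INFO)
FIT_DIR = "/path/to/fit/files"                 # assumption: folder holding the .fit files
UPLOAD_URL = "https://app.trainingpeaks.com/"  # assumption: page exposing an upload widget

def upload_one(path):
    # One headless browser per file; authentication/session handling is omitted here.
    options = Options()
    options.add_argument("--headless")
    driver = webdriver.Firefox(options=options)
    try:
        driver.get(UPLOAD_URL)
        # assumption: the upload widget is backed by an <input type="file"> element
        file_input = WebDriverWait(driver, 30).until(
            expected_conditions.presence_of_element_located(
                (By.CSS_SELECTOR, "input[type=file]")))
        file_input.send_keys(os.path.abspath(path))
        time.sleep(5)  # crude wait for the upload to finish
        logging.info("uploaded %s", path)
    except (NoSuchElementException, TimeoutException):
        logging.exception("failed to upload %s", path)
    finally:
        driver.quit()

fit_files = sorted(os.path.join(FIT_DIR, f) for f in os.listdir(FIT_DIR) if f.endswith(".fit"))
Parallel(n_jobs=4)(delayed(upload_one)(p) for p in fit_files)

The point is only the overall shape: one headless browser per file, with joblib providing the parallelism the imports hint at.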
@jlafall
jlafall / WordCountKafkaCouchbase.scala
Last active January 24, 2018 16:28
Apache Spark Structured Streaming word count using Kafka as the source and Couchbase as the sink
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
object WordCountKafkaCouchbase {
def main(args: Array[String]) {
// create spark session with settings
val spark = SparkSession
.builder
.appName("Word Count Test")
.config("spark.couchbase.username", "[username goes here]")
@polvi
polvi / README.md
Created May 3, 2017 23:53
HDFS of Kubernetes

Easiest HDFS cluster in the world with Kubernetes.

Inspiration from kimoonkim/kubernetes-HDFS

kubectl create -f namenode.yaml
kubectl create -f datanode.yaml

Set up a port-forward so you can see that it is alive:

@yoyama
yoyama / Schema2CaseClass.scala
Created January 20, 2017 07:36
Generate a case class from a Spark DataFrame/Dataset schema.
/**
* Generate Case class from DataFrame.schema
*
* val df:DataFrame = ...
*
* val s2cc = new Schema2CaseClass
* import s2cc.implicit._
*
* println(s2cc.schemaToCaseClass(df.schema, "MyClass"))
*
*/
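
The gist itself is Scala; purely to illustrate the same idea in Python, here is a hedged sketch that walks a DataFrame schema and emits the source of an equivalent container (a dataclass). The helper name and the deliberately minimal type mapping are made up for this example.

from pyspark.sql.types import (StructType, StringType, IntegerType, LongType,
                               DoubleType, BooleanType)

# Assumed, minimal mapping from Spark SQL types to Python type hints.
_TYPE_MAP = {
    StringType: "str",
    IntegerType: "int",
    LongType: "int",
    DoubleType: "float",
    BooleanType: "bool",
}

def schema_to_dataclass(schema: StructType, class_name: str) -> str:
    # The generated source expects `from dataclasses import dataclass` and
    # `from typing import Optional` in the module where it is pasted.
    lines = ["@dataclass", "class {}:".format(class_name)]
    for field in schema.fields:
        py_type = _TYPE_MAP.get(type(field.dataType), "object")
        if field.nullable:
            py_type = "Optional[{}]".format(py_type)
        lines.append("    {}: {}".format(field.name, py_type))
    return "\n".join(lines)

# Usage, mirroring the Scala comment above:
# print(schema_to_dataclass(df.schema, "MyClass"))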
@ursuad
ursuad / kafka-cheat-sheet.md
Last active July 24, 2024 09:43
Quick command reference for Apache Kafka

Kafka Topics

List existing topics

bin/kafka-topics.sh --zookeeper localhost:2181 --list

Describe a topic

bin/kafka-topics.sh --zookeeper localhost:2181 --describe --topic mytopic

Purge a topic

bin/kafka-topics.sh --zookeeper localhost:2181 --alter --topic mytopic --config retention.ms=1000

... wait a minute ...

#!/usr/bin/python
import sys
import time
import socket
import struct
import logging
import binascii
import subprocess
@zoltanctoth
zoltanctoth / pyspark-udf.py
Last active July 15, 2023 13:23
Writing a UDF for withColumn in PySpark
from pyspark.sql.types import StringType
from pyspark.sql.functions import udf
# Assumes an active SparkSession named `spark` (e.g. the pyspark shell).
maturity_udf = udf(lambda age: "adult" if age >= 18 else "child", StringType())
df = spark.createDataFrame([{'name': 'Alice', 'age': 1}])
df = df.withColumn("maturity", maturity_udf(df.age))  # withColumn returns a new DataFrame
df.show()
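
For reference, the same UDF can also be written with the decorator form of pyspark.sql.functions.udf; a minimal sketch reusing the df from above:

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

@udf(returnType=StringType())
def maturity(age):
    # same rule as the lambda above
    return "adult" if age >= 18 else "child"

df = df.withColumn("maturity", maturity(df.age))
df.show()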