Angel Ortega (he/they) joyoyoyoyoyo

🏴
living it up and busy supporting my community.
package sbt
package internal
package fix

import scalafix.v1._
import scala.meta._

class Sbt0_13BuildSyntax extends SyntacticRule("Sbt0_13BuildSyntax") {
  override def fix(implicit doc: SyntacticDocument): Patch = {
    doc.tree.collect {
      // Placeholder case: the rule's actual pattern matches on the
      // sbt 0.13 build DSL are not shown in this preview.
      case _ => Patch.empty
    }.asPatch
  }
}
@yatsu
yatsu / numpy-openblas-macos-pip.sh
Created April 5, 2020 10:31
Install numpy with enabling openblas using pip on macOS
# Setup HomeBrew: https://brew.sh/
brew install openblas
pip download --no-binary :all: --no-deps numpy
unzip numpy-1.18.2.zip # (you may have a newer version)
cd numpy-1.18.2
cat > site.cfg <<EOF
[openblas]
libraries = openblas
library_dirs = $(brew --prefix openblas)/lib
include_dirs = $(brew --prefix openblas)/include
EOF
pip install .  # build numpy against the OpenBLAS configured in site.cfg
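Once installed, `numpy.show_config()` prints which BLAS the build linked against; a small matrix multiply also exercises the BLAS path. A minimal check (the 500x500 size is an arbitrary choice):

```python
import numpy as np

# A medium-size matmul is dispatched to the linked BLAS
# (OpenBLAS, if the build above picked up site.cfg).
a = np.random.rand(500, 500)
b = np.random.rand(500, 500)
c = a @ b
print(c.shape)  # (500, 500)
```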
@joyoyoyoyoyo
joyoyoyoyoyo / 01-deserialization_exception.md
Created July 25, 2019 04:40 — forked from tjcelaya/01-deserialization_exception.md
Kryo serialization bug, based on java-manta-examples/src/main/java/ClientEncryptionServerMultipart.java

If you run App.java with the "all" argument you should get no errors. "all-with-serialization" also works because the pointer-to-native-memory that gets serialized is still valid as long as the JVM doesn't terminate.

If you run App.java once with "initiate" and then subsequently with "complete" you will receive the following exception only if libnss is in use:

Exception in thread "main" com.esotericsoftware.kryo.KryoException: Error during Java deserialization.
Serialization trace:
sessionRef (sun.security.pkcs11.Session)
session (sun.security.pkcs11.SessionKeyRef)
sessionKeyRef (sun.security.pkcs11.P11Key$P11SecretKey)
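The root cause described above — serializing a reference to process-local native state (a PKCS#11 session pointer) — is not Kryo-specific. As a loose analogy (not part of the gist), Python's pickle rejects an object wrapping an OS file descriptor outright, for the same underlying reason:

```python
import os
import pickle

# A file object wraps an OS-level file descriptor: native, process-local
# state, analogous to the sessionRef pointer in the trace above.
f = open(os.devnull)
try:
    pickle.dumps(f)
except TypeError as e:
    print("cannot serialize a native handle:", e)
finally:
    f.close()
```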
@nuga99
nuga99 / docker-install-parrot.sh
Last active February 2, 2025 08:44
Install Docker Engine on Parrot OS (2023)
#!/bin/sh
# From https://www.hiroom2.com/2017/09/24/parrotsec-3-8-docker-engine-en/
# Changelog:
# @DavoedM: Apr 3, 2020
# @C922A10971734: Jan 19, 2023
set -e
# Install dependencies.
@btakeya
btakeya / A.java
Last active July 25, 2019 04:27
Java Serialization/Deserialization
import java.io.*;

public class A implements Serializable {
    private static final long serialVersionUID = 100L;

    private int value;

    public A(int n) {
        this.value = n;
    }
}
#!/usr/bin/env bash
# smoke-test EMR access with a minimal single-node cluster (note: still incurs EC2 cost)
aws emr create-cluster \
--name "1-node dummy cluster" \
--ec2-attributes KeyName=???,SubnetId=subnet-??? \
--instance-type m4.large \
--release-label emr-5.23.0 \
--instance-count 1 \
  --use-default-roles
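If you prefer scripting this from Python, the same call maps onto boto3's `emr.run_job_flow`. A sketch of the equivalent request (the role names are the defaults implied by `--use-default-roles`; the `???` placeholders are left unfilled on purpose):

```python
# Build the same request as the CLI call above.
request = {
    "Name": "1-node dummy cluster",
    "ReleaseLabel": "emr-5.23.0",
    "Instances": {
        "MasterInstanceType": "m4.large",
        "InstanceCount": 1,
        "Ec2KeyName": "???",          # placeholder, as in the CLI example
        "Ec2SubnetId": "subnet-???",  # placeholder, as in the CLI example
    },
    "JobFlowRole": "EMR_EC2_DefaultRole",
    "ServiceRole": "EMR_DefaultRole",
}

# With credentials configured this would be submitted as:
#   boto3.client("emr").run_job_flow(**request)
print(sorted(request))
```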
@maatthc
maatthc / Glue_Spark_job_example.yml
Last active February 10, 2022 18:02
Cloud Formation example for Glue Spark Job with metrics and scheduler
AWSTemplateFormatVersion: '2010-09-09'
Description: Cloud Formation example for Glue Spark Job with metrics and scheduler
Parameters:
  ArtifactBucket:
    Description: A global deployable artefact bucket
    Type: String
    Default: artefacts
  ServiceName:
    Description: Service Name that owns the stack when created
@j3speaks
j3speaks / Dockerfile
Created May 17, 2019 20:28
Dockerize Spark
FROM openjdk:8-alpine
RUN apk --update add wget tar bash python
RUN wget http://apache.mirror.anlx.net/spark/spark-2.4.2/spark-2.4.2-bin-hadoop2.7.tgz
RUN tar -xzf spark-2.4.2-bin-hadoop2.7.tgz && mv spark-2.4.2-bin-hadoop2.7 spark && rm spark-2.4.2-bin-hadoop2.7.tgz
RUN printf "#!/bin/sh\n/spark/bin/spark-class org.apache.spark.deploy.master.Master --host \$SPARK_MASTER_HOST --port \$SPARK_MASTER_PORT --webui-port \$SPARK_MASTER_WEBUI_PORT" > /start-master.sh
RUN chmod +x /start-master.sh
RUN printf "#!/bin/sh\n/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port \$SPARK_WORKER_WEBUI_PORT \$SPARK_MASTER_URL" > /start-worker.sh
RUN chmod +x /start-worker.sh
COPY cloudera-10k.txt /cloudera-10k.txt
COPY employee.txt /spark/examples/src/main/scala/org/apache/spark/examples/sql/employee.txt
package hadoopsters.spark.scala.monitoring.listeners

import org.apache.spark.streaming.kafka010.OffsetRange
import org.apache.spark.streaming.scheduler._
import org.joda.time.DateTime

/**
* :: ExampleStreamingListener ::
* A simple StreamingListener that accesses summary statistics across Spark Streaming batches; inherits from DeveloperAPI.
*
package tv.spotx.scala.dbutils

import java.sql.{Connection, DriverManager}
import java.util.Properties

import org.apache.commons.pool2.impl.{DefaultPooledObject, GenericObjectPool}
import org.apache.commons.pool2.{BasePooledObjectFactory, PooledObject}
import org.apache.logging.log4j.scala.Logging