Jason Nerothin jasonnerothin

import logging
import threading
import os
import time # <-- NEW: Needed for tracking metric freshness
from typing import Dict, Any, List
# --- OpenTelemetry Imports ---
from opentelemetry.metrics import set_meter_provider, get_meter_provider, CallbackOptions, Observation
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
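The imports above suggest an observable (pull-based) metric whose callback checks how fresh the cached reading is before exporting it. That callback pattern can be sketched in plain Python without the OpenTelemetry SDK; the names `record`, `observe_callback`, and `STALE_AFTER_S` are assumptions for illustration, not part of the gist:

```python
import time

STALE_AFTER_S = 60.0  # assumed freshness window, in seconds
_last_value = {"value": 0.0, "updated_at": time.time()}

def record(value: float) -> None:
    """Update the cached reading and note when it was taken."""
    _last_value["value"] = value
    _last_value["updated_at"] = time.time()

def observe_callback():
    """Shape of an observable-gauge callback: yield the cached value
    only while it is fresh, so stale data is not exported."""
    age = time.time() - _last_value["updated_at"]
    if age <= STALE_AFTER_S:
        # In real OpenTelemetry code this would yield Observation(value, attrs).
        yield _last_value["value"]
```

In the SDK proper, the callback would be registered via `meter.create_observable_gauge(..., callbacks=[...])` and would yield `Observation` objects instead of raw floats.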
@jasonnerothin
jasonnerothin / otlp_logger.py
Last active October 31, 2025 03:42
OTLP logging exporter
import logging
import os
import time
from opentelemetry._logs import set_logger_provider
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.sdk.resources import Resource
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
@jasonnerothin
jasonnerothin / arch.java
Created December 17, 2020 05:00
OS & Architecture
class arch {
    public static void main(String[] args) {
        String os = System.getProperty("os.name").toLowerCase().replace(" ", "_");
        String arch = System.getProperty("os.arch").toLowerCase().replace(" ", "_");
        System.out.println("OS: " + os + " arch: " + arch);
    }
}
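The same OS-and-architecture lookup can be sketched in Python with the stdlib `platform` module (a comparison sketch, not part of the gist; the function name `os_arch` is an assumption):

```python
import platform

def os_arch() -> str:
    """Normalized OS name and machine architecture, mirroring the Java snippet."""
    os_name = platform.system().lower().replace(" ", "_")
    arch = platform.machine().lower().replace(" ", "_")
    return f"OS: {os_name} arch: {arch}"

print(os_arch())
```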
DECLARE INT month_ago;
DECLARE INT streams_cutover_date;
SET month_ago = 200;
SET streams_cutover_date = 2000;
-- ~~~ historical ~~~ --
INGEST * FROM source_database
WHERE t < month_ago;
-- ~~~ incremental ~~~ --
INGEST * FROM source_database
@jasonnerothin
jasonnerothin / ReadWriteParquet.py
Created September 24, 2019 04:14
Read parquet, simple transformations, write parquet using spark sql
from pyspark import SparkContext
from pyspark.sql import SparkSession
from pyspark.sql.functions import lit
from random import randint, seed
input_file = '/tmp/input.snappy.parquet'
output_file = '/tmp/output.snappy.parquet'
spark = SparkSession(SparkContext('local', 'make-recs-application'))
@jasonnerothin
jasonnerothin / pyproject.toml
Created August 21, 2019 20:27
Airflow with poetry
[tool.poetry]
name = "legion-airflow"
version = "0.1.0"
description = "Airflow Operators for Legion"
license = "Apache-2.0"
authors = [
"[email protected] <[email protected]>"
]
@jasonnerothin
jasonnerothin / SparkActions.md
Last active April 21, 2019 22:34
Spark Action Summary

Spark Actions (as of 2.4.1)

There are three kinds of actions:

  • view data in the console
  • collect data to language-native objects
  • write to output data sources

.aggregate(zeroValue)(seqOp, combOp)
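The contract of the `.aggregate` action above (a zero value, a per-partition `seqOp`, and a cross-partition `combOp`) can be sketched in plain Python; the function and parameter names here are stand-ins for Spark's API, not Spark itself:

```python
from functools import reduce

def aggregate(partitions, zero, seq_op, comb_op):
    """Pure-Python sketch of RDD.aggregate: fold each partition with
    seq_op starting from zero, then merge partial results with comb_op."""
    partials = [reduce(seq_op, part, zero) for part in partitions]
    return reduce(comb_op, partials, zero)

# Example: compute sum and count in one pass over two "partitions".
data = [[1, 2, 3], [4, 5]]
total, count = aggregate(
    data,
    (0, 0),                                      # zero value: (sum, count)
    lambda acc, x: (acc[0] + x, acc[1] + 1),     # fold one element in
    lambda a, b: (a[0] + b[0], a[1] + b[1]),     # merge partition results
)
```

Because `seqOp` and `combOp` can have different types on the left and right, `aggregate` is more general than `reduce`, which requires a single associative operation.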
@jasonnerothin
jasonnerothin / install-bosh-manager.sh
Created April 29, 2018 14:57
bosh / cf install script
#!/bin/bash -x
readonly HERE=$(pwd)
readonly BASEDIR=~/scratch
readonly BOSHINSTALLNAME='bosh0'
# SETUP
readonly BOSHDIR="${BASEDIR}/${BOSHINSTALLNAME}"
echo "Setting up deployment in ${BOSHDIR}..."
// ...And we will take two more readings.
int on_1 = analogRead(photoresistor); // read photoresistor
delay(200); // wait 200 milliseconds
int on_2 = analogRead(photoresistor); // read photoresistor
delay(300); // wait 300 milliseconds
intactValue = (on_1+on_2)/2;
brokenValue = (off_1+off_2)/2;
beamThreshold = (intactValue+brokenValue)/2;
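The calibration logic above (average two "beam intact" readings, average two "beam broken" readings, take the midpoint as the trigger level) can be sketched in Python; the function name and sample values are assumptions for illustration:

```python
def beam_threshold(on_readings, off_readings):
    """Midpoint between the average 'beam intact' and 'beam broken' readings,
    using integer math as the Arduino sketch does."""
    intact = sum(on_readings) // len(on_readings)    # e.g. (on_1 + on_2) / 2
    broken = sum(off_readings) // len(off_readings)  # e.g. (off_1 + off_2) / 2
    return (intact + broken) // 2

# With intact readings near 800 and broken readings near 200,
# the threshold lands at the midpoint between the two averages.
```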
// constants
const int greenLEDPin = 9;
const int redLEDPin = 11;
const int blueLEDPin = 10;
const int redSensorPin = A0;
const int greenSensorPin = A1; // each sensor needs its own analog input
const int blueSensorPin = A2;