Juan Coria (juanmc2005)

juanmc2005 / diart_whisper.py
Last active October 29, 2024 23:55
Code for my tutorial "Color Your Captions: Streamlining Live Transcriptions with Diart and OpenAI's Whisper". Available at https://medium.com/@juanmc2005/color-your-captions-streamlining-live-transcriptions-with-diart-and-openais-whisper-6203350234ef
import logging
import os
import sys
import traceback
from contextlib import contextmanager
import diart.operators as dops
import numpy as np
import rich
import rx.operators as ops
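The preview shows only the imports; as a minimal, hedged sketch of the Whisper half of the tutorial (the model size and file name are illustrative assumptions, not the gist's values), transcribing a single audio chunk looks like this:

import whisper  # openai-whisper

model = whisper.load_model("small")      # assumed model size
result = model.transcribe("chunk.wav")   # hypothetical audio chunk produced by the diart pipeline
print(result["text"])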
juanmc2005 / jobarray_template.slurm
Last active April 26, 2022 13:06
Template script to run sbatch job arrays
#!/bin/bash
#SBATCH --job-name=<some_name>
#SBATCH --account=<some_account> # optional, charge job hours to this account
#SBATCH --ntasks=1 # number of tasks (a single process here)
#SBATCH --gres=gpu:1 # number of GPUs (a single GPU here)
#SBATCH --cpus-per-task=4 # number of cores, mostly for data loader workers
#SBATCH --hint=nomultithread # optional, restrict to physical cores and not logical ones
#SBATCH --time=20:00:00 # maximum execution time (HH:MM:SS)
#SBATCH --output=logs_%A_%a.out # output file; %A is the array job ID, %a is the task index within the array
#SBATCH --error=logs_%A_%a.out # error file (pointing it at the output file is fine)
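# The gist preview cuts off here; a job array also needs an array range and a
# command to run. A sketch under assumed values (not the original's):
#SBATCH --array=0-9              # run 10 tasks with indices 0..9

srun python main.py --task "$SLURM_ARRAY_TASK_ID"   # main.py is a hypothetical entry point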
juanmc2005 / sync.sh
Last active November 19, 2020 16:29
Synchronize a project with a remote server using rsync with exponential backoff. Just place the script at the root of the project and run it.
#!/bin/bash
# Sync project with remote server using rsync with exponential backoff.
# Just copy it to the root of the project and make sure rsync is installed before running.
#
# Why I use this script:
# - sshfs doesn't play well with IntelliJ's file sync (IDE freezes quite often)
# - pushing silly commits for debugging or running experiments is not a good practice
#
# I built this script to reduce my network traffic and be more efficient in my workflow.
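# The preview ends before the loop itself; a minimal sketch of the technique
# described above (REMOTE and the exclude list are assumed placeholders,
# not the gist's values):
REMOTE="user@host:~/project"
delay=1
until rsync -avz --delete --exclude '.git' . "$REMOTE"; do
    echo "rsync failed, retrying in ${delay}s..." >&2
    sleep "$delay"
    delay=$((delay * 2))   # double the wait after each failure: exponential backoff
done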
juanmc2005 / LICENSE
Last active November 19, 2020 16:31
This license applies to all my public gists unless stated otherwise
MIT License
Copyright (c) 2020 Juan Manuel Coria
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
juanmc2005 / model.py
Last active October 3, 2022 15:27
PLDA scoring using pyannote-audio (https://github.com/pyannote/pyannote-audio) and a customized version of PLDA (https://github.com/RaviSoji/plda) that adds features such as length normalization and latent-space dimensionality tuning
# Copyright 2017 Ravi Sojitra. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
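The length normalization mentioned in the description typically rescales each embedding before PLDA scoring; a minimal NumPy sketch of that step (function and variable names are illustrative, not the gist's):

import numpy as np

def length_normalize(embeddings):
    # Project each row onto the hypersphere of radius sqrt(d), a common
    # preprocessing step before PLDA scoring (illustrative sketch).
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return np.sqrt(embeddings.shape[1]) * embeddings / norms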
juanmc2005 / NetworkObservable.java
Created June 14, 2017 19:43
Toy RxJava2 Observable providing some utilities for networking use cases
package com.example.juancoria.rxjavaplayground;
import java.util.concurrent.Callable;
import java.util.concurrent.TimeUnit;
import io.reactivex.Observable;
import io.reactivex.Single;
import io.reactivex.SingleObserver;
import io.reactivex.SingleSource;
import io.reactivex.android.schedulers.AndroidSchedulers;
public interface Action {
void run();
}
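The preview stops at the Action callback; a hedged sketch of the kind of utility such a file might expose (the class and method names, timeout, and retry count are assumptions, not the gist's code):

class NetworkObservables {
    // Run a blocking call on the IO scheduler, deliver the result on the
    // Android main thread, time out after 10 seconds, and retry twice.
    static <T> Single<T> fromCall(Callable<T> call) {
        return Single.fromCallable(call)
                .subscribeOn(io.reactivex.schedulers.Schedulers.io())
                .observeOn(AndroidSchedulers.mainThread())
                .timeout(10, TimeUnit.SECONDS)
                .retry(2);
    }
}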
module NeuralNet (newFeedForward, predict, train, layers, activation, loss) where
-- A Feed Forward Neural Network implementation
import Data.Matrix
import System.Random
type Layer = Matrix Double
type Delta = Matrix Double
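The preview ends at the type aliases; a minimal sketch of a single dense-layer forward pass consistent with them (the function name and the choice of sigmoid are assumptions):

forward :: Layer -> Matrix Double -> Matrix Double
forward w x = fmap sigmoid (multStd w x)   -- multStd is Data.Matrix's matrix product
  where sigmoid z = 1 / (1 + exp (-z))     -- elementwise activation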
juanmc2005 / perceptron.hs
Last active December 24, 2016 02:53
Perceptron in Haskell
module Perceptron (newPerceptron, predict, train, weights, activation) where
-- A Perceptron implementation
type Weights = [Float]
type Inputs = [Float]
type LearningRate = Float
type Epochs = Int
type ErrorValue = Float
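The preview again stops at the type aliases; a hedged sketch of a predict consistent with them (the threshold activation is an assumption; the gist's actual definitions may differ):

predict :: Weights -> Inputs -> Float
predict ws xs = step (sum (zipWith (*) ws xs))   -- weighted sum of the inputs
  where step s = if s >= 0 then 1 else 0         -- assumed step activation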