8bit-pixies

@8bit-pixies
8bit-pixies / index.md
Last active September 29, 2024 01:23
Interesting Open Source (for Commercial Use) Generative AI models

Speech to Text

The original Whisper model is a good speech to text transcription model which is used in many places: https://huggingface.co/openai/whisper-large-v3
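
For reference, a minimal sketch of running that checkpoint through the Hugging Face transformers pipeline (the audio file name is only a placeholder):

from transformers import pipeline

# Load the checkpoint linked above and transcribe a local audio file.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-large-v3")
print(asr("sample.wav")["text"])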

Text to Speech

WhisperSpeech is a good text-to-speech model with voice cloning: https://huggingface.co/WhisperSpeech/WhisperSpeech. It uses an MIT license (unlike coqui and suno). It isn't the "best" model, but for its size it is very, very good.
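
A rough usage sketch from memory of the WhisperSpeech README; the Pipeline class, the generate_to_file call, and the default model it downloads are assumptions, so check the model card for the current API:

from whisperspeech.pipeline import Pipeline

# Assumed API: a Pipeline that synthesises text straight to a wav file.
pipe = Pipeline()
pipe.generate_to_file("output.wav", "Hello from WhisperSpeech")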

Another alternative is a purely ONNX-driven model sponsored by txtai: https://huggingface.co/NeuML/ljspeech-jets-onnx

write a glowing review for my real estate agent who sold my apartment and provided attention to our particular requests and wants

Subject: A Stellar Real Estate Experience with [Agent's Name]

I recently had the pleasure of working with [Agent's Name] from [Real Estate Agency] to sell my apartment, and I cannot express enough how impressed I am with the level of service and professionalism I received throughout the entire process.

From our initial consultation to the successful closing, [Agent's Name] demonstrated a deep understanding of the real estate market and a genuine commitment to helping me achieve the best possible outcome. Their expertise and guidance were invaluable, making what could have been a stressful experience remarkably smooth and efficient.

One of the standout qualities of [Agent's Name] was their excellent communication skills. They kept me informed every step of the way, promptly responding to my inquiries and providing regular updates on the progress of the sale. This level of tran

@8bit-pixies
8bit-pixies / monokai.colorscheme
Last active December 5, 2021 01:06
Monokai colorscheme for qterminal or konsole
[Background]
Color=39,40,34
[BackgroundIntense]
Color=65,67,57
[Foreground]
Color=225,225,218
[ForegroundIntense]
@8bit-pixies
8bit-pixies / river_to_shap.py
Last active January 22, 2021 04:57
This demonstrates how we could naively convert a tree in River to work with the Shap library, in order to start a discussion. https://github.com/online-ml/river/issues/437
# The goal of this is to try to make use of Shap to explain a tree built in river.
# https://github.com/online-ml/river/issues/437
from functools import reduce
import operator
import numpy as np
import pandas as pd
import pprint
from sklearn import datasets
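
The conversion ultimately targets shap's custom-tree dictionary format, so a sketch of the end state looks roughly like this (the arrays describe a single illustrative stump, not anything taken from the gist, and the exact keys required can vary across shap versions):

import numpy as np
import shap

# One decision stump in TreeExplainer's custom tree format; a River tree would
# be flattened into the same parallel arrays, one entry per node.
tree = {
    "children_left": np.array([1, -1, -1]),      # index of left child, -1 for leaves
    "children_right": np.array([2, -1, -1]),     # index of right child
    "children_default": np.array([2, -1, -1]),   # child followed on missing values
    "features": np.array([0, -2, -2]),           # split feature, -2 for leaves
    "thresholds": np.array([0.5, 0.0, 0.0]),     # split threshold per node
    "values": np.array([[0.0], [0.0], [1.0]]),   # output value per node
    "node_sample_weight": np.array([100.0, 60.0, 40.0]),
}
explainer = shap.TreeExplainer({"trees": [tree]})
print(explainer.shap_values(np.array([[0.7]])))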
#!/usr/bin/env python
"""
Simple example of a full screen application with a vertical split.
This will show a window on the left for user input. When the user types, the
reversed input is shown on the right. Pressing Ctrl-Q will quit the application.
"""
from prompt_toolkit.application import Application
from prompt_toolkit.buffer import Buffer
from prompt_toolkit.key_binding import KeyBindings
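
The preview stops at the imports; a sketch of how the rest of such an application typically fits together is below (the layout imports and the exact split are assumptions based on prompt_toolkit's documented API, not the gist's own code):

from prompt_toolkit.layout.containers import VSplit, Window
from prompt_toolkit.layout.controls import BufferControl, FormattedTextControl
from prompt_toolkit.layout.layout import Layout

left_buffer = Buffer()
right_control = FormattedTextControl(text="")

def on_text_changed(_):
    # Mirror the reversed input on the right-hand pane.
    right_control.text = left_buffer.text[::-1]

left_buffer.on_text_changed += on_text_changed

kb = KeyBindings()

@kb.add("c-q")
def _(event):
    event.app.exit()

layout = Layout(
    VSplit([
        Window(BufferControl(buffer=left_buffer)),
        Window(width=1, char="|"),  # thin divider between the two panes
        Window(right_control),
    ])
)

Application(layout=layout, key_bindings=kb, full_screen=True).run()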
@8bit-pixies
8bit-pixies / model_drift.py
Created November 17, 2019 21:04
Brief example of monitoring model drift using statsmodels models
import pandas as pd
import statsmodels.api as sm
def test_rolling_mode():
    """
    The goal of this application-level monitoring is to react to changes that
    warrant re-fitting the model and to track movement in the modelling process.
    In the linear model scenario, confidence intervals around the coefficients
    can be used as a measure of model drift over time.
    """
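
As a sketch of that idea (not the gist's actual code), fit OLS on successive windows and watch how the coefficient and its confidence interval move:

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
beta = np.where(np.arange(n) < n // 2, 1.0, 2.0)  # coefficient shifts halfway through, simulating drift
y = beta * x + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"x": x, "y": y})

window = 100
for start in range(0, n - window + 1, window):
    chunk = df.iloc[start:start + window]
    fit = sm.OLS(chunk["y"], sm.add_constant(chunk["x"])).fit()
    lo, hi = fit.conf_int().loc["x"]
    print(f"rows {start}-{start + window}: coef={fit.params['x']:.2f} CI=({lo:.2f}, {hi:.2f})")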
@8bit-pixies
8bit-pixies / simplegrucell.py
Created November 10, 2019 20:50
This is an implementation of grucell in Keras. This should allow for a bit more flexibility when not working under the "recurrent" framework
"""
This is a manual implementaiton of grucell so that it will work in more
general envrionments...
"""
import tensorflow as tf
input_size = 64
cell_size = 32
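
For reference, the GRU cell equations such a cell implements, written as a small TF2-style sketch (biases omitted; the original gist targets an older Keras API, so this is only an illustration using the same input_size and cell_size):

import tensorflow as tf

input_size, cell_size = 64, 32
W_z = tf.Variable(tf.random.normal([input_size + cell_size, cell_size]) * 0.01)  # update gate weights
W_r = tf.Variable(tf.random.normal([input_size + cell_size, cell_size]) * 0.01)  # reset gate weights
W_h = tf.Variable(tf.random.normal([input_size + cell_size, cell_size]) * 0.01)  # candidate weights

def gru_cell(x, h_prev):
    xh = tf.concat([x, h_prev], axis=1)
    z = tf.sigmoid(tf.matmul(xh, W_z))   # update gate
    r = tf.sigmoid(tf.matmul(xh, W_r))   # reset gate
    h_tilde = tf.tanh(tf.matmul(tf.concat([x, r * h_prev], axis=1), W_h))  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde

h = gru_cell(tf.zeros([8, input_size]), tf.zeros([8, cell_size]))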
<script src='https://unpkg.com/[email protected]/dist/tesseract.min.js'></script>
<input type="file" onchange="getRego(this.files)">
<script>
  function getRego(files) {
    const worker = new Tesseract.TesseractWorker();
    worker.recognize(files[0]).then(function(data){
      // construct something which glues all the words together...
      console.log(data.text);
    })
  }
</script>
import numpy as np

def linear_part(x, w):
    return x * w

def non_linear(x, p=0.001):
    # Leaky-ReLU-style nonlinearity: keep positive entries, scale the rest by p.
    multi = (x > 0).astype(np.float64)
    multi[multi == 0] = p
    return x * multi
pareto_principle <- function(x) {
  # Invert the 80/20 power law: given a share of the effort (x), return the share
  # of the causes it comes from (0.8 of the effort maps back to 0.2 of the causes).
  return(x ^ (log(0.2) / log(0.8)))
}
sprintf("%.2f of the effort comes from %.2f of the causes", 0.99, pareto_principle(0.99))
sprintf("%.2f of the effort comes from %.2f of the causes", 0.95, pareto_principle(0.95))
sprintf("%.2f of the effort comes from %.2f of the causes", 0.8, pareto_principle(0.8))
sprintf("%.2f of the effort comes from %.2f of the causes", 0.5, pareto_principle(0.5))