@hanxiao
hanxiao / testRegex.js
Last active March 21, 2025 06:53
Regex for chunking by using all semantic cues
// Updated: Aug. 20, 2024
// Run: node testRegex.js whatever.txt
// Live demo: https://jina.ai/tokenizer
// LICENSE: Apache-2.0 (https://www.apache.org/licenses/LICENSE-2.0)
// COPYRIGHT: Jina AI
const fs = require('fs');
const util = require('util');
// Define variables for magic numbers
const MAX_HEADING_LENGTH = 7;
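As a point of comparison, here is a minimal Python sketch of the same idea: chunking text on structural cues (headings, list items, sentence boundaries) and greedily packing the pieces under a length cap. The pattern and the chunk function are hypothetical simplifications, not the gist's actual regex, which covers far more cues.

import re

# Hypothetical, simplified pattern: the gist's real regex handles many more cues
# (code fences, tables, quotes, per-chunk-type length caps).
CHUNK_PATTERN = re.compile(
    r"(?:^#{1,6} .*$)"          # Markdown headings
    r"|(?:^[*+-] .*$)"          # list items
    r"|(?:[^.!?\n]+[.!?]+)"     # sentences ending in punctuation
    r"|(?:[^\n]+)",             # fallback: any remaining line content
    re.MULTILINE,
)

def chunk(text, max_len=512):
    """Greedily pack regex matches into chunks of at most max_len characters."""
    chunks, current = [], ""
    for match in CHUNK_PATTERN.finditer(text):
        piece = match.group(0).strip()
        if not piece:
            continue
        if current and len(current) + len(piece) + 1 > max_len:
            chunks.append(current)
            current = piece
        else:
            current = f"{current} {piece}" if current else piece
    if current:
        chunks.append(current)
    return chunks

print(chunk("# Title\nFirst sentence. Second sentence!\n- a list item\n", max_len=40))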
@disler
disler / README_MINIMAL_PROMPT_CHAINABLE.md
Last active March 31, 2025 17:44
Minimal Prompt Chainables - Zero LLM Library Sequential Prompt Chaining & Prompt Fusion

Minimal Prompt Chainables

Sequential prompt chaining in one method with context and output back-referencing.
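The core idea can be sketched in a few lines of Python. Names below are hypothetical (the real class lives in chain.py); the point is that each prompt template can back-reference earlier outputs via {{output[-1]}}-style placeholders and fill context variables via {{key}}.

from typing import Callable

def run_chain(
    llm: Callable[[str], str],          # any text-in/text-out callable
    context: dict,
    prompt_templates: list,
) -> list:
    outputs = []
    for template in prompt_templates:
        prompt = template
        # Back-reference previous outputs: {{output[-1]}}, {{output[-2]}}, ...
        for i, prev in enumerate(reversed(outputs), start=1):
            prompt = prompt.replace(f"{{{{output[-{i}]}}}}", prev)
        # Fill context variables: {{topic}}, {{audience}}, ...
        for key, value in context.items():
            prompt = prompt.replace(f"{{{{{key}}}}}", value)
        outputs.append(llm(prompt))
    return outputs

# Usage with a stand-in "model" that just echoes its prompt:
echo = lambda p: f"<answer to: {p}>"
results = run_chain(
    echo,
    context={"topic": "prompt chaining"},
    prompt_templates=[
        "Give one sentence about {{topic}}.",
        "Rewrite this more simply: {{output[-1]}}",
    ],
)
print(results)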

Files

  • main.py - start here - a full example that uses MinimalChainable from chain.py to build a sequential prompt chain
  • chain.py - the zero-library, minimal prompt chain class
  • chain_test.py - tests for chain.py; you can ignore this
  • requirements.py - Python requirements

Setup

@0xdevalias
0xdevalias / _deobfuscating-unminifying-obfuscated-web-app-code.md
Last active March 30, 2025 06:51
Some notes and tools for reverse engineering / deobfuscating / unminifying obfuscated web app code
import time
import os
import logging
import random
from datasets import load_dataset
# Helper for quantizing a causal LM with AutoGPTQ over the given bit widths,
# group sizes and damping factors.
class QuantAutoGPTQ:
    def __init__(self, model_name_or_path, output_dir, dataset,
                 num_samples=128, trust_remote_code=False, cache_examples=True,
                 use_fast=True, use_triton=False, bits=[4], group_size=[128], damp=[0.01],
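The constructor arguments above (num_samples, bits, group_size, damp) map onto the usual AutoGPTQ quantization flow. A hedged illustration of that flow, not this gist's actual code, assuming the auto-gptq package and with illustrative model/dataset names:

from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
from datasets import load_dataset

model_name_or_path = "facebook/opt-125m"   # hypothetical example value
output_dir = "opt-125m-gptq-4bit"          # hypothetical example value

quantize_config = BaseQuantizeConfig(bits=4, group_size=128, damp_percent=0.01)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoGPTQForCausalLM.from_pretrained(model_name_or_path, quantize_config)

# Calibration examples: a handful of tokenized dataset rows.
data = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
texts = [t for t in data["text"] if t.strip()][:128]
examples = [tokenizer(t, return_tensors="pt") for t in texts]

model.quantize(examples)
model.save_quantized(output_dir, use_safetensors=True)
tokenizer.save_pretrained(output_dir)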
# coding=utf-8
# Copyright 2023 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch
import os
import argparse
def get_args():
    parser = argparse.ArgumentParser()
    parser.add_argument("--base_model_name_or_path", type=str)
➜ babyagi git:(main) python3.9 babyagi.py
*****OBJECTIVE*****
I have been given no context other than I am an agent existing in a universe. Why am I here? I do not know. I must discover my own objective.
Initial task: Begin
*****TASK LIST*****
@YuRaNnNzZZ
YuRaNnNzZZ / TFA Base Documentation.md
Last active January 7, 2025 04:04
TFA Base Documentation (obsolete)
@RubenKelevra
RubenKelevra / fast_firefox.md
Last active April 3, 2025 02:02
Make Firefox fast again
@shobhitic
shobhitic / MyToken.sol
Created March 26, 2022 11:24
Merkle tree Allowlist / Whitelist for NFT
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.4;
import "@openzeppelin/[email protected]/token/ERC721/ERC721.sol";
import "@openzeppelin/[email protected]/access/Ownable.sol";
import "@openzeppelin/[email protected]/utils/Counters.sol";
import "@openzeppelin/[email protected]/utils/cryptography/MerkleProof.sol";
contract MyToken is ERC721, Ownable {
    using Counters for Counters.Counter;
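The contract checks a caller-supplied Merkle proof against a single root stored on-chain instead of storing every allowlisted address. A small Python sketch of the mechanism: sha256 stands in for Solidity's keccak256, and hash_pair mimics OpenZeppelin's MerkleProof, which hashes each pair in sorted order so the verifier never needs left/right positions.

import hashlib

def h(data: bytes) -> bytes:
    # Solidity/OpenZeppelin uses keccak256; sha256 here only to illustrate the mechanism.
    return hashlib.sha256(data).digest()

def hash_pair(a: bytes, b: bytes) -> bytes:
    # Sorted-pair hashing, as in OpenZeppelin's MerkleProof.
    return h(min(a, b) + max(a, b))

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hash_pair(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify(proof, root, leaf):
    computed = h(leaf)
    for sibling in proof:
        computed = hash_pair(computed, sibling)
    return computed == root

# Example allowlist of addresses (as bytes); on-chain, only the root is stored
# and each minter submits the proof for their own leaf.
allowlist = [b"0xAlice", b"0xBob", b"0xCarol", b"0xDave"]
root = merkle_root(allowlist)
# Proof for b"0xAlice": its sibling leaf hash, then the hash of the other pair.
proof = [h(b"0xBob"), hash_pair(h(b"0xCarol"), h(b"0xDave"))]
print(verify(proof, root, b"0xAlice"))  # True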