Ender Loc Phan enderphan94
@enderphan94
enderphan94 / Extract_Blind_SQLi.py
Last active September 27, 2020 05:37
Burp Suite Web Academy
import requests, sys

# Suppress TLS warnings when routing traffic through the Burp proxy
requests.packages.urllib3.disable_warnings(
    requests.packages.urllib3.exceptions.InsecureRequestWarning)

def sql_engine(payload):
    proxies = {'http': 'http://127.0.0.1:8080', 'https': 'http://127.0.0.1:8080'}
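The gist preview cuts off before the extraction loop, but the technique in its title can be sketched without any network traffic. Below is a minimal, hedged illustration of boolean-based blind SQLi extraction: `oracle()` is a stand-in for the HTTP request `sql_engine` would send through Burp, and `SECRET` is an invented value standing in for the lab's password.

```python
import string

# Stand-in secret; in the real lab this would live in the target database.
SECRET = "s3cretpw"

def oracle(position, guess):
    # Simulates a boolean blind-SQLi oracle. A real payload would resemble:
    #   ' AND SUBSTRING(password,{position},1)='{guess}'--
    # and the function would return True when the page indicates a match.
    return position <= len(SECRET) and SECRET[position - 1] == guess

def extract(max_len=32):
    # Recover the value one character at a time using only yes/no answers.
    result = ""
    for pos in range(1, max_len + 1):
        for ch in string.ascii_lowercase + string.digits:
            if oracle(pos, ch):
                result += ch
                break
        else:
            break  # no character matched: we've reached the end of the value
    return result

print(extract())  # → s3cretpw
```

In the real exploit, each `oracle()` call is one HTTP request, so narrowing the guess with a binary search over character codes instead of a linear scan cuts requests roughly from ~36 to ~7 per character.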

Author: enderlocphan@gmail.com

Foreword

This document recaps my experience with automated smart-contract testing, alongside manual testing. It also records the issues I ran into along the way, together with the solutions I found.

Connecting with Remix from localhost

For a complex project, you can't just copy-paste a single .sol file and run it. To make life easier, Remix offers a localhost connection that lets it interact with a project on your local machine remotely.

@enderphan94
enderphan94 / gethat.sh
Last active March 18, 2024 03:36
Setting up a hardhat project
#!/bin/bash
# Scaffold a Hardhat project with ethers, waffle, and chai
npm init --yes
npm install --save-dev hardhat
npm install --save-dev @nomiclabs/hardhat-ethers ethers @nomiclabs/hardhat-waffle ethereum-waffle chai
echo "require('@nomiclabs/hardhat-waffle');" > hardhat.config.js
mkdir -p contracts test
npx hardhat compile
@enderphan94
enderphan94 / grephttp
Created March 9, 2023 02:44
Extract links in web source
grep -Eo "https?://[a-zA-Z0-9./?=_-]*" file
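The same extraction can be done in Python with `re.findall`, which is handy when the links need further processing. The character class mirrors the grep pattern above, so URLs containing characters outside it (e.g. `&` or `%`) are truncated at that character; the sample HTML is invented for illustration.

```python
import re

# Mirrors the grep -Eo pattern: match http/https URLs in page source
URL_RE = re.compile(r"https?://[a-zA-Z0-9./?=_-]*")

html = '<a href="https://example.com/page?id=1">x</a> <img src="http://cdn.example.com/a.png">'
print(URL_RE.findall(html))
# → ['https://example.com/page?id=1', 'http://cdn.example.com/a.png']
```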
@enderphan94
enderphan94 / exLib.js
Created December 12, 2023 01:39
Extract all libraries of a website
// npm install puppeteer
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: "new" });
  const page = await browser.newPage();
  await page.goto('https://www.pikakasino.com/', { timeout: 60000 }); // 60 seconds
  // Assumed continuation (the gist preview is truncated here):
  // collect the src of every external <script> tag the page loads
  const libs = await page.$$eval('script[src]', els => els.map(e => e.src));
  console.log(libs);
  await browser.close();
})();
#!/bin/bash
# Usage: ./findCred.sh /path/to/your/directory
# Check that a directory is provided
if [ -z "$1" ]; then
  echo "Usage: $0 /path/to/directory"
  exit 1
fi
TARGET_DIR=$1
# Assumed continuation (the gist preview is truncated here):
# recursively grep the tree for common credential patterns
grep -rniE "(password|passwd|secret|api[_-]?key|token)[[:space:]]*[:=]" "$TARGET_DIR"
@enderphan94
enderphan94 / CrawlMe.sh
Created June 27, 2025 14:17
One line Crawl+Scan
# gshuf is GNU shuf from Homebrew coreutils on macOS; use shuf on Linux
while read domain; do
  UA=$(gshuf -n 1 -e 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)' 'Mozilla/5.0 (X11; Linux x86_64)' 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)')
  katana -u "$domain" -hl -jc --no-sandbox -c 1 -p 1 -rd 3 -rl 3 -H "User-Agent: $UA" | \
    httpx -silent -status-code -follow-redirects -tls-probe -random-agent -fr | \
    nuclei -headless -sresp -rate-limit 10 -concurrency 10 -severity critical,high,medium \
      -tags login,auth,exposure,api -markdown-export output/ -tlsi -stats
done < domains.txt