@shbatm
Created February 28, 2020 04:33
Purge Junk Unifi Clients
#!/usr/bin/env python3
import requests
from urllib.parse import urljoin

username = "REDACTED"
password = r"REDACTED"
cloud_key_ip = "192.168.1.2"
controller_port = 8443
site_name = "default"
base_url = "https://{cloud_key_ip}:{controller_port}".format(
    cloud_key_ip=cloud_key_ip, controller_port=controller_port
)

# How many do you have to forget?
#
# The API call is a **POST** to `/api/s/{site}/cmd/stamgr` with the body
# `{"macs":["00:1e:35:ff:ff:ff"],"cmd":"forget-sta"}`. Yes, it does look like you
# could submit them all in bulk to the API, but the web UI doesn't expose that.
#
# To fetch the list of all devices in JSON, **GET** `/api/s/{site}/stat/alluser`.
#
# Shouldn't be that hard to throw something together in Python.

def api_login(sess, base_url):
    # Log in to the controller; the session object keeps the auth cookies.
    payload = {"username": username, "password": password}
    url = urljoin(base_url, "/api/login")
    resp = sess.post(url, json=payload, headers={"Referer": "/login"})
    if resp.status_code == 200:
        print("[*] successfully logged in")
        return True
    else:
        print("[!] failed to login with provided credentials")
        return False

def api_get_clients(sess, base_url, site_name):
    # Fetch every client the controller has ever seen for this site.
    url = urljoin(
        base_url, "/api/s/{site_name}/stat/alluser".format(site_name=site_name)
    )
    resp = sess.get(url)
    client_list = resp.json()["data"]
    print("[*] retrieved client list")
    return client_list

def api_del_clients(sess, base_url, site_name, macs):
    # Forget all of the given client MACs in a single bulk "forget-sta" command.
    payload = {"cmd": "forget-sta", "macs": macs}
    url = urljoin(base_url, "/api/s/{site_name}/cmd/stamgr".format(site_name=site_name))
    resp = sess.post(url, json=payload)
    client_list = resp.json()["data"]
    print("[*] purge complete")
    return client_list

def client_macs(client_list):
    # Junk clients: no fixed IP assigned and never transmitted or received a packet.
    macs = []
    for client in client_list:
        if ("use_fixedip" not in client
            and ("tx_packets" in client and client["tx_packets"] == 0)
            and ("rx_packets" in client and client["rx_packets"] == 0)
            and "mac" in client
        ):
            macs.append(client["mac"])
    print("[*] {!s} clients identified for purge".format(len(macs)))
    return macs

if __name__ == "__main__":
    sess = requests.Session()
    sess.verify = False  # the Cloud Key uses a self-signed certificate
    requests.packages.urllib3.disable_warnings()
    success = api_login(sess=sess, base_url=base_url)
    if success:
        client_list = api_get_clients(sess=sess, base_url=base_url, site_name=site_name)
        macs = client_macs(client_list=client_list)
        api_del_clients(sess=sess, base_url=base_url, site_name=site_name, macs=macs)
shbatm commented Feb 7, 2021

Comment out the last line, and then play with the filters in lines 59 through 62.

You can see the full client list that gets downloaded by sticking the following between lines 43 and 44:

print(json.dumps(client_list, indent=2))

@sminker81

This is the output after adding print(json.dumps(client_list, indent=2)) between lines 43 and 44. Thanks for taking the time to help.

stdout

[*] successfully logged in
stderr

Traceback (most recent call last):
File "/chronos/scripts/purge-clients/purge-clients.py", line 78, in
client_list = api_get_clients(sess=sess, base_url=base_url, site_name=site_name)
File "/chronos/scripts/purge-clients/purge-clients.py", line 43, in api_get_clients
print(json.dumps(client_list, indent=2))
NameError: name 'json' is not defined

shbatm commented Feb 7, 2021

Oh, sorry -- put import json at the top of the file too.
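Putting both suggestions together, a minimal sketch of the modified api_get_clients might look like this (only the import json line and the json.dumps() call are additions; everything else is unchanged from the script above):

import json  # needed for the debug print below
import requests
from urllib.parse import urljoin

def api_get_clients(sess, base_url, site_name):
    url = urljoin(
        base_url, "/api/s/{site_name}/stat/alluser".format(site_name=site_name)
    )
    resp = sess.get(url)
    client_list = resp.json()["data"]
    # Dump the raw client records so you can see which fields are available
    # before deciding on filter criteria.
    print(json.dumps(client_list, indent=2))
    print("[*] retrieved client list")
    return client_list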

@sminker81

Yeah, not picking up anything. Looks like I have to play with the filters.

stdout

[*] successfully logged in
[]
[*] retrieved client list
[*] 0 clients identified for purge
[*] purge complete
stderr

andy-vdg commented Sep 12, 2023

Run pip3 install requests before using this script.

Those filters did not work for me, as none of the records returned had fixed-IP or TX/RX values in them.
I got it to work by changing the filter to remove everything that was last seen more than 3 months ago.
Starting at line 58, I changed the filter to check against the Unix timestamp from 3 months ago. [edit: made it dynamic]

Add import datetime (and import json, used by the print below) to the top of the script.

def client_macs(client_list):
    macs = []
    three_months_ago = int((datetime.datetime.now() - datetime.timedelta(days=90)).timestamp())
    for client in client_list:
        if (("last_ip" not in client)
            and ("last_seen" in client and client["last_seen"] <= three_months_ago)
            and "mac" in client
        ):
            macs.append(client["mac"])
    print("[*] {!s} clients identified for purge".format(len(macs)))
    print(json.dumps(macs, indent=2))
    return macs

For testing, you can add print(json.dumps(macs, indent=2)) right after the print("[*] {!s} clients identified for purge".format(len(macs))) line, and comment out the last line of the script in order to do a dry run:
#api_del_clients(sess=sess, base_url=base_url, site_name=site_name, macs=macs)
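For reference, a dry-run version of the __main__ block (assuming the functions defined in the script above) would look something like this:

if __name__ == "__main__":
    sess = requests.Session()
    sess.verify = False
    requests.packages.urllib3.disable_warnings()
    success = api_login(sess=sess, base_url=base_url)
    if success:
        client_list = api_get_clients(sess=sess, base_url=base_url, site_name=site_name)
        # Dry run: list the MACs that would be forgotten, but don't actually purge them.
        macs = client_macs(client_list=client_list)
        # api_del_clients(sess=sess, base_url=base_url, site_name=site_name, macs=macs)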
