I hereby claim:
- I am sriganesh on github.
- I am sriganesh (https://keybase.io/sriganesh) on keybase.
- I have a public key ASDCuOOsBJ7-VhGbNyvDeH1ABwb1_4jLHhCaVZQkGLiRLwo
To claim this, I am signing this object:
RestTemplate restTemplate = new RestTemplate();
List<Claim> claims = Collections.emptyList();
try {
    // Call the claims endpoint and deserialize the response body
    // directly into a typed List<Claim>.
    ResponseEntity<List<Claim>> claimResponse = restTemplate.exchange(
            uri,
            HttpMethod.GET,
            null,
            new ParameterizedTypeReference<List<Claim>>() {});
    if (claimResponse.hasBody()) {
        claims = claimResponse.getBody();
    }
} catch (RestClientException e) {
    // Request failed; fall back to the empty list.
}
Rank | Type | Prefix/Suffix
---|---|---
1 | Prefix | my+
2 | Suffix | +online
3 | Prefix | the+
4 | Suffix | +web
5 | Suffix | +media
6 | Prefix | web+
7 | Suffix | +world
8 | Suffix | +net
9 | Prefix | go+
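For quick experimentation, the ranked prefixes and suffixes above can be combined with a keyword to generate candidate domain names. The sketch below is only illustrative; the keyword "travel" and the generate_candidates helper are assumptions, not part of the original list.

# Sketch: combine a keyword with the ranked prefixes/suffixes above.
# The keyword "travel" and this helper are illustrative assumptions.
PREFIXES = ["my", "the", "web", "go"]
SUFFIXES = ["online", "web", "media", "world", "net"]

def generate_candidates(keyword):
    names = [p + keyword for p in PREFIXES]
    names += [keyword + s for s in SUFFIXES]
    return names

print(generate_candidates("travel"))
# ['mytravel', 'thetravel', 'webtravel', 'gotravel',
#  'travelonline', 'travelweb', 'travelmedia', 'travelworld', 'travelnet']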
curl ipaddress.sh
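The same lookup can be done from Python when curl is not available; this is a minimal sketch assuming ipaddress.sh returns the public IP as plain text.

# Sketch: fetch the public IP the same way the curl one-liner does,
# assuming the service returns the address as plain text.
import requests

public_ip = requests.get("https://ipaddress.sh", timeout=5).text.strip()
print(public_ip)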
cd /var/lib/docker/volumes/global-nginx-proxy_certs/_data
mkdir tmp
mv * tmp/
cd ~
ee site ssl-renew example.com --force

Move the old certs out of /var/lib/docker/volumes/global-nginx-proxy_certs/_data (or delete them), then run the site ssl-renew command with --force to retry renewal of the expired certificate.
https://github.com/EasyEngine/easyengine/issues/1429
AA AB AC AD AE AF AG AH AI AJ AK AL AM AN AO AP AQ AR AS AT AU AV AW AX AY AZ
BA BB BC BD BE BF BG BH BI BJ BK BL BM BN BO BP BQ BR BS BT BU BV BW BX BY BZ
CA CB CC CD CE CF CG CH CI CJ CK CL CM CN CO CP CQ CR CS CT CU CV CW CX CY CZ
DA DB DC DD DE DF DG DH DI DJ DK DL DM DN DO DP DQ DR DS DT DU DV DW DX DY DZ
EA EB EC ED EE EF EG EH EI EJ EK EL EM EN EO EP EQ ER ES ET EU EV EW EX EY EZ
FA FB FC FD FE FF FG FH FI FJ FK FL FM FN FO FP FQ FR FS FT FU FV FW FX FY FZ
GA GB GC GD GE GF GG GH GI GJ GK GL GM GN GO GP GQ GR GS GT GU GV GW GX GY GZ
HA HB HC HD HE HF HG HH HI HJ HK HL HM HN HO HP HQ HR HS HT HU HV HW HX HY HZ
IA IB IC ID IE IF IG IH II IJ IK IL IM IN IO IP IQ IR IS IT IU IV IW IX IY IZ
JA JB JC JD JE JF JG JH JI JJ JK JL JM JN JO JP JQ JR JS JT JU JV JW JX JY JZ

Aa Ab Ac Ad Ae Af Ag Ah Ai Aj Ak Al Am An Ao Ap Aq Ar As At Au Av Aw Ax Ay Az
Ba Bb Bc Bd Be Bf Bg Bh Bi Bj Bk Bl Bm Bn Bo Bp Bq Br Bs Bt Bu Bv Bw Bx By Bz
Ca Cb Cc Cd Ce Cf Cg Ch Ci Cj Ck Cl Cm Cn Co Cp Cq Cr Cs Ct Cu Cv Cw Cx Cy Cz
Da Db Dc Dd De Df Dg Dh Di Dj Dk Dl Dm Dn Do Dp Dq Dr Ds Dt Du Dv Dw Dx Dy Dz
Ea Eb Ec Ed Ee Ef Eg Eh Ei Ej Ek El Em En Eo Ep Eq Er Es Et Eu Ev Ew Ex Ey Ez
Fa Fb Fc Fd Fe Ff Fg Fh Fi Fj Fk Fl Fm Fn Fo Fp Fq Fr Fs Ft Fu Fv Fw Fx Fy Fz
Ga Gb Gc Gd Ge Gf Gg Gh Gi Gj Gk Gl Gm Gn Go Gp Gq Gr Gs Gt Gu Gv Gw Gx Gy Gz
Ha Hb Hc Hd He Hf Hg Hh Hi Hj Hk Hl Hm Hn Ho Hp Hq Hr Hs Ht Hu Hv Hw Hx Hy Hz
Ia Ib Ic Id Ie If Ig Ih Ii Ij Ik Il Im In Io Ip Iq Ir Is It Iu Iv Iw Ix Iy Iz
Ja Jb Jc Jd Je Jf Jg Jh Ji Jj Jk Jl Jm Jn Jo Jp Jq Jr Js Jt Ju Jv Jw Jx Jy Jz
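The two grids above (upper-case pairs AA through JZ and mixed-case pairs Aa through Jz) can be regenerated programmatically. Here is a short sketch using itertools; the A-J range for the first letter is taken from the grids, the 26-per-row layout is just to match the formatting above.

# Sketch: regenerate the two-letter pairs above
# (first letters A-J, second letters A-Z or a-z).
from itertools import product
import string

first = string.ascii_uppercase[:10]  # 'A'..'J'
upper_pairs = [a + b for a, b in product(first, string.ascii_uppercase)]
mixed_pairs = [a + b for a, b in product(first, string.ascii_lowercase)]

# Print 26 pairs per row, matching the grid layout above.
for row in (upper_pairs[i:i + 26] for i in range(0, len(upper_pairs), 26)):
    print(" ".join(row))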
from bs4 import BeautifulSoup
import requests
import requests.exceptions
from urllib.parse import urlsplit, urlparse
from collections import deque
import re

# starting URL for the crawl
url = "https://scrapethissite.com"

# a queue of urls to be crawled
new_urls = deque([url])
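The imports above set up a simple breadth-first crawler. Below is a minimal crawl-loop sketch that pops URLs from the queue and enqueues same-site links; the processed_urls set and the 10-page limit are illustrative assumptions rather than part of the original snippet.

# Minimal crawl-loop sketch: pop URLs from the queue, fetch each page,
# and enqueue same-site links found in <a href="..."> tags.
# The processed_urls set and the 10-page limit are illustrative assumptions.
processed_urls = set()

while new_urls and len(processed_urls) < 10:
    current = new_urls.popleft()
    processed_urls.add(current)
    try:
        response = requests.get(current, timeout=10)
    except requests.exceptions.RequestException:
        continue
    base = "{0.scheme}://{0.netloc}".format(urlsplit(current))
    soup = BeautifulSoup(response.text, "html.parser")
    for anchor in soup.find_all("a", href=True):
        link = anchor["href"]
        if link.startswith("/"):
            link = base + link
        if link.startswith(base) and link not in new_urls and link not in processed_urls:
            new_urls.append(link)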