Created December 10, 2018 15:25
0.0 TEL | Telepresence 0.95 launched at Mon Dec 10 02:55:56 2018
0.0 TEL | /usr/bin/telepresence --namespace humandb-01 --swap-deployment hdb-adv-api --expose 5500 --run yarn run dev
0.0 TEL | Platform: linux
0.0 TEL | Python 3.5.2 (default, Nov 12 2018, 13:43:14)
0.0 TEL | [GCC 5.4.0 20160609]
0.0 TEL | [1] Running: uname -a
0.0 1 | Linux ba215df0b568 4.9.93-linuxkit-aufs #1 SMP Wed Jun 6 16:55:56 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
0.0 TEL | BEGIN SPAN main.py:47(main)
0.0 TEL | BEGIN SPAN startup.py:55(__init__)
0.0 TEL | [2] Capturing: kubectl version --short
0.8 TEL | [3] Capturing: kubectl config current-context
0.8 TEL | [4] Capturing: kubectl config view -o json
1.0 TEL | [5] Capturing: kubectl --context humandb-01 get ns humandb-01
1.8 TEL | Command: kubectl 1.12.2
1.8 TEL | Context: humandb-01, namespace: humandb-01, version: 1.10.11-eks
1.8 TEL | END SPAN startup.py:55(__init__) 1.8s
1.8 TEL | [6] Capturing: ssh -V
1.9 TEL | [7] Running: sudo -n echo -n
1.9 >>> | Starting proxy with method 'vpn-tcp', which has the following limitations: All processes are affected, only one telepresence can run per machine, and you can't use other VPNs. You may need to add cloud hosts and headless services with --also-proxy. For a full list of method limitations see https://telepresence.io/reference/methods.html
1.9 >>> | Volumes are rooted at $TELEPRESENCE_ROOT. See https://telepresence.io/howto/volumes.html for details.
1.9 TEL | [8] Capturing: kubectl --context humandb-01 --namespace humandb-01 get pods telepresence-connectivity-check --ignore-not-found
3.0 TEL | Scout info: {'notices': [], 'latest_version': '0.95', 'application': 'telepresence'}
3.0 TEL | BEGIN SPAN deployment.py:120(supplant_deployment)
3.0 TEL | BEGIN SPAN remote.py:78(get_deployment_json)
3.0 TEL | [9] Capturing: kubectl --context humandb-01 --namespace humandb-01 get deployment -o json --export hdb-adv-api
3.5 TEL | END SPAN remote.py:78(get_deployment_json) 0.5s
3.5 TEL | [10] Running: kubectl --context humandb-01 --namespace humandb-01 delete deployment hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d --ignore-not-found
4.1 TEL | [11] Running: kubectl --context humandb-01 --namespace humandb-01 apply -f -
4.9 11 | deployment.extensions/hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d created
4.9 TEL | [12] Running: kubectl --context humandb-01 --namespace humandb-01 scale deployment hdb-adv-api --replicas=0
5.5 12 | deployment.extensions/hdb-adv-api scaled
5.5 TEL | END SPAN deployment.py:120(supplant_deployment) 2.5s
5.5 TEL | BEGIN SPAN remote.py:154(get_remote_info)
5.5 TEL | BEGIN SPAN remote.py:78(get_deployment_json)
5.5 TEL | [13] Capturing: kubectl --context humandb-01 --namespace humandb-01 get deployment -o json --export --selector=telepresence=705b7d5d9b234ba7b66125f72a11063d
6.1 TEL | END SPAN remote.py:78(get_deployment_json) 0.6s
6.1 TEL | Searching for Telepresence pod:
6.1 TEL | with name hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d-*
6.1 TEL | with labels {'telepresence': '705b7d5d9b234ba7b66125f72a11063d', 'app': 'hdb-advaita-api'}
6.1 TEL | [14] Capturing: kubectl --context humandb-01 --namespace humandb-01 get pod -o json --export --selector=telepresence=705b7d5d9b234ba7b66125f72a11063d
6.9 TEL | Checking hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d-756d578f8d-5646z
6.9 TEL | Looks like we've found our pod!
6.9 TEL | BEGIN SPAN remote.py:116(wait_for_pod)
7.0 TEL | [15] Capturing: kubectl --context humandb-01 --namespace humandb-01 get pod hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d-756d578f8d-5646z -o json
7.6 TEL | END SPAN remote.py:116(wait_for_pod) 0.7s
7.6 TEL | END SPAN remote.py:154(get_remote_info) 2.1s
7.6 TEL | BEGIN SPAN connect.py:36(connect)
7.6 TEL | [16] Launching kubectl logs: kubectl --context humandb-01 --namespace humandb-01 logs -f hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d-756d578f8d-5646z --container hdb-advaita-api --tail=10
7.6 TEL | [17] Launching kubectl port-forward: kubectl --context humandb-01 --namespace humandb-01 port-forward hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d-756d578f8d-5646z 42113:8022
7.7 TEL | [18] Running: ssh -F /dev/null -q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -p 42113 telepresence@localhost /bin/true
7.7 TEL | [18] exit 255 in 0.03 secs.
7.9 TEL | [19] Running: ssh -F /dev/null -q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -p 42113 telepresence@localhost /bin/true
8.0 TEL | [19] exit 255 in 0.03 secs.
8.2 TEL | [20] Running: ssh -F /dev/null -q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -p 42113 telepresence@localhost /bin/true
8.2 TEL | [20] exit 255 in 0.01 secs.
8.3 16 | 2018-12-10T02:56:02+0000 [-] Loading ./forwarder.py...
8.3 16 | 2018-12-10T02:56:02+0000 [-] /etc/resolv.conf changed, reparsing
8.3 16 | 2018-12-10T02:56:02+0000 [-] Resolver added ('172.20.0.10', 53) to server list
8.3 16 | 2018-12-10T02:56:02+0000 [-] SOCKSv5Factory starting on 9050
8.3 16 | 2018-12-10T02:56:02+0000 [socks.SOCKSv5Factory#info] Starting factory <socks.SOCKSv5Factory object at 0x7fa2811dca20>
8.3 16 | 2018-12-10T02:56:02+0000 [-] DNSDatagramProtocol starting on 9053
8.3 16 | 2018-12-10T02:56:02+0000 [-] Starting protocol <twisted.names.dns.DNSDatagramProtocol object at 0x7fa2811dcdd8>
8.3 16 | 2018-12-10T02:56:02+0000 [-] Loaded.
8.3 16 | 2018-12-10T02:56:02+0000 [twisted.scripts._twistd_unix.UnixAppLogger#info] twistd 18.9.0 (/usr/bin/python3.6 3.6.5) starting up.
8.3 16 | 2018-12-10T02:56:02+0000 [twisted.scripts._twistd_unix.UnixAppLogger#info] reactor class: twisted.internet.epollreactor.EPollReactor.
8.5 17 | Forwarding from 127.0.0.1:42113 -> 8022
8.5 TEL | [21] Running: ssh -F /dev/null -q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -p 42113 telepresence@localhost /bin/true
8.5 17 | Handling connection for 42113
8.9 >>> | Forwarding remote port 5500 to local port 5500.
8.9 TEL | [22] Launching SSH port forward (exposed ports): ssh -N -oServerAliveInterval=1 -oServerAliveCountMax=10 -F /dev/null -q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -p 42113 telepresence@localhost -R '*:5500:127.0.0.1:5500'
8.9 17 | Handling connection for 42113
8.9 >>> |
8.9 TEL | Launching Web server for proxy poll
8.9 TEL | [23] Launching SSH port forward (socks and proxy poll): ssh -N -oServerAliveInterval=1 -oServerAliveCountMax=10 -F /dev/null -q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -p 42113 telepresence@localhost -L127.0.0.1:42721:127.0.0.1:9050 -R9055:127.0.0.1:40131
8.9 17 | Handling connection for 42113
8.9 TEL | END SPAN connect.py:36(connect) 1.3s
8.9 TEL | BEGIN SPAN remote_env.py:28(get_remote_env)
8.9 TEL | [24] Capturing: kubectl --context humandb-01 --namespace humandb-01 exec hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d-756d578f8d-5646z --container hdb-advaita-api -- python3 podinfo.py
10.0 TEL | [24] captured in 1.13 secs.
10.0 TEL | END SPAN remote_env.py:28(get_remote_env) 1.1s
10.0 TEL | BEGIN SPAN mount.py:32(mount_remote_volumes)
10.0 TEL | [25] Capturing: sshfs -p 42113 -F /dev/null -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null telepresence@localhost:/ /tmp/tel-ql_r_y12/fs
10.0 17 | Handling connection for 42113
10.5 TEL | END SPAN mount.py:32(mount_remote_volumes) 0.4s
10.5 TEL | BEGIN SPAN vpn.py:235(connect_sshuttle)
10.5 TEL | BEGIN SPAN vpn.py:79(get_proxy_cidrs)
10.5 TEL | [26] Capturing: kubectl --context humandb-01 --namespace humandb-01 get nodes -o json
11.0 TEL | [27] Capturing: kubectl --context humandb-01 --namespace humandb-01 get services -o json
11.7 TEL | [28] Running: kubectl --context humandb-01 --namespace humandb-01 create service clusterip telepresence-1544410568-0244868-385 --tcp=3000
12.2 28 | service/telepresence-1544410568-0244868-385 created
12.2 TEL | [29] Running: kubectl --context humandb-01 --namespace humandb-01 create service clusterip telepresence-1544410568-552292-385 --tcp=3000
12.7 29 | service/telepresence-1544410568-552292-385 created
12.7 TEL | [30] Running: kubectl --context humandb-01 --namespace humandb-01 create service clusterip telepresence-1544410569-0480034-385 --tcp=3000
13.3 30 | service/telepresence-1544410569-0480034-385 created
13.3 TEL | [31] Capturing: kubectl --context humandb-01 --namespace humandb-01 get services -o json
14.0 TEL | [32] Running: kubectl --context humandb-01 --namespace humandb-01 delete service telepresence-1544410568-0244868-385
14.6 32 | service "telepresence-1544410568-0244868-385" deleted
14.6 TEL | [33] Running: kubectl --context humandb-01 --namespace humandb-01 delete service telepresence-1544410568-552292-385
15.2 33 | service "telepresence-1544410568-552292-385" deleted
15.2 TEL | [34] Running: kubectl --context humandb-01 --namespace humandb-01 delete service telepresence-1544410569-0480034-385
15.9 34 | service "telepresence-1544410569-0480034-385" deleted
15.9 >>> | Guessing that Services IP range is 172.20.0.0/16. Services started after this point will be inaccessible if they are outside this range; restart telepresence if you can't access a new Service.
15.9 TEL | END SPAN vpn.py:79(get_proxy_cidrs) 5.5s
15.9 TEL | [35] Launching sshuttle: sshuttle-telepresence -v --dns --method nat -e 'ssh -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -F /dev/null' --to-ns 127.0.0.1:9053 -r telepresence@localhost:42113 172.20.0.0/16
15.9 TEL | BEGIN SPAN vpn.py:277(connect_sshuttle,sshuttle-wait)
15.9 TEL | [36] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence0'"'"')'
16.0 36 | Traceback (most recent call last):
16.0 36 | File "<string>", line 1, in <module>
16.0 36 | socket.gaierror: [Errno -2] Name or service not known
16.0 TEL | [36] exit 1 in 0.06 secs.
16.1 TEL | [37] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence1'"'"')'
16.1 37 | Traceback (most recent call last):
16.1 37 | File "<string>", line 1, in <module>
16.1 37 | socket.gaierror: [Errno -2] Name or service not known
16.1 TEL | [37] exit 1 in 0.05 secs.
16.2 TEL | [38] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence2'"'"')'
16.3 38 | Traceback (most recent call last):
16.3 38 | File "<string>", line 1, in <module>
16.3 38 | socket.gaierror: [Errno -2] Name or service not known
16.3 TEL | [38] exit 1 in 0.05 secs.
16.4 35 | Starting sshuttle proxy.
16.4 TEL | [39] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence3'"'"')'
16.4 39 | Traceback (most recent call last):
16.4 39 | File "<string>", line 1, in <module>
16.4 39 | socket.gaierror: [Errno -2] Name or service not known
16.4 TEL | [39] exit 1 in 0.04 secs.
16.5 TEL | [40] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence4'"'"')'
16.6 40 | Traceback (most recent call last):
16.6 40 | File "<string>", line 1, in <module>
16.6 40 | socket.gaierror: [Errno -2] Name or service not known
16.6 TEL | [40] exit 1 in 0.04 secs.
16.7 TEL | [41] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence5'"'"')'
16.7 41 | Traceback (most recent call last):
16.7 41 | File "<string>", line 1, in <module>
16.7 41 | socket.gaierror: [Errno -2] Name or service not known
16.7 TEL | [41] exit 1 in 0.05 secs.
16.8 35 | firewall manager: Starting firewall with Python version 3.5.2
16.8 35 | firewall manager: ready method name nat.
16.8 35 | IPv6 enabled: False
16.8 35 | UDP enabled: False
16.8 35 | DNS enabled: True
16.8 35 | TCP redirector listening on ('127.0.0.1', 12300).
16.8 35 | DNS listening on ('127.0.0.1', 12300).
16.8 35 | Starting client with Python version 3.5.2
16.8 35 | c : connecting to server...
16.8 17 | Handling connection for 42113
16.8 TEL | [42] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence6'"'"')'
16.9 35 | Warning: Permanently added '[localhost]:42113' (ECDSA) to the list of known hosts.
16.9 42 | Traceback (most recent call last):
16.9 42 | File "<string>", line 1, in <module>
16.9 42 | socket.gaierror: [Errno -2] Name or service not known
17.0 TEL | [42] exit 1 in 0.12 secs.
17.1 TEL | [43] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence7'"'"')'
17.2 43 | Traceback (most recent call last):
17.2 43 | File "<string>", line 1, in <module>
17.2 43 | socket.gaierror: [Errno -2] Name or service not known
17.2 TEL | [43] exit 1 in 0.11 secs.
17.2 35 | Starting server with Python version 3.6.5
17.2 35 | s: latency control setting = True
17.2 35 | s: available routes:
17.2 35 | c : Connected.
17.2 35 | firewall manager: setting up.
17.2 35 | >> iptables -t nat -N sshuttle-12300
17.2 35 | >> iptables -t nat -F sshuttle-12300
17.2 35 | >> iptables -t nat -I OUTPUT 1 -j sshuttle-12300
17.2 35 | >> iptables -t nat -I PREROUTING 1 -j sshuttle-12300
17.2 35 | >> iptables -t nat -A sshuttle-12300 -j RETURN --dest 127.0.0.1/32 -p tcp
17.3 35 | >> iptables -t nat -A sshuttle-12300 -j REDIRECT --dest 172.20.0.0/16 -p tcp --to-ports 12300 -m ttl ! --ttl 42
17.3 35 | >> iptables -t nat -A sshuttle-12300 -j REDIRECT --dest 192.168.65.1/32 -p udp --dport 53 --to-ports 12300 -m ttl ! --ttl 42
17.3 35 | >> iptables -t nat -A sshuttle-12300 -j REDIRECT --dest 224.0.0.252/32 -p udp --dport 5355 --to-ports 12300 -m ttl ! --ttl 42
17.3 35 | conntrack v1.4.3 (conntrack-tools): 0 flow entries have been deleted.
17.3 TEL | [44] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence8'"'"')'
17.3 35 | c : DNS request from ('172.17.0.2', 51075) to None: 47 bytes
17.3 16 | 2018-12-10T02:56:13+0000 [stdout#info] Set DNS suffix we filter out to: [(b'ogilvy', b'com')]
17.3 16 | 2018-12-10T02:56:13+0000 [stdout#info] Result for b'hellotelepresence8.ogilvy.com' is ['127.0.0.1']
17.4 TEL | [45] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence9'"'"')'
17.5 35 | c : DNS request from ('172.17.0.2', 46202) to None: 47 bytes
17.5 16 | 2018-12-10T02:56:13+0000 [stdout#info] Result for b'hellotelepresence9.ogilvy.com' is ['127.0.0.1']
17.6 TEL | [46] Capturing: python3 -c 'import socket; socket.gethostbyname('"'"'hellotelepresence10'"'"')'
17.7 35 | c : DNS request from ('172.17.0.2', 35206) to None: 48 bytes
17.7 16 | 2018-12-10T02:56:13+0000 [stdout#info] Result for b'hellotelepresence10.ogilvy.com' is ['127.0.0.1']
17.8 TEL | END SPAN vpn.py:277(connect_sshuttle,sshuttle-wait) 1.9s
17.8 TEL | END SPAN vpn.py:235(connect_sshuttle) 7.3s
17.8 TEL | CRASH: [Errno 2] No such file or directory: 'yarn'
17.8 TEL | Traceback (most recent call last):
17.8 TEL | File "/usr/bin/telepresence/telepresence/cli.py", line 130, in crash_reporting
17.8 TEL | yield
17.8 TEL | File "/usr/bin/telepresence/telepresence/main.py", line 84, in main
17.8 TEL | runner, remote_info, env, socks_port, ssh, mount_dir
17.8 TEL | File "/usr/bin/telepresence/telepresence/outbound/setup.py", line 61, in launch
17.8 TEL | runner_, remote_info, command, args.also_proxy, env, ssh
17.8 TEL | File "/usr/bin/telepresence/telepresence/outbound/local.py", line 122, in launch_vpn
17.8 TEL | process = Popen(command, env=env)
17.8 TEL | File "/usr/lib/python3.5/subprocess.py", line 947, in __init__
17.8 TEL | restore_signals, start_new_session)
17.8 TEL | File "/usr/lib/python3.5/subprocess.py", line 1551, in _execute_child
17.8 TEL | raise child_exception_type(errno_num, err_msg)
17.8 TEL | FileNotFoundError: [Errno 2] No such file or directory: 'yarn'
17.8 TEL | (calling crash reporter...)
29.5 >>> | Exit cleanup in progress
29.5 TEL | (Cleanup) Kill BG process [35] sshuttle
29.5 TEL | (Cleanup) Unmount remote filesystem
29.5 TEL | [47] Running: fusermount -z -u /tmp/tel-ql_r_y12/fs
29.5 35 | >> iptables -t nat -D OUTPUT -j sshuttle-12300
29.5 35 | >> iptables -t nat -D PREROUTING -j sshuttle-12300
29.5 35 | >> iptables -t nat -F sshuttle-12300
29.5 35 | >> iptables -t nat -X sshuttle-12300
29.5 35 | firewall manager: Error trying to undo /etc/hosts changes.
29.5 35 | firewall manager: ---> Traceback (most recent call last):
29.5 35 | firewall manager: ---> File "/root/.pex/install/sshuttle_telepresence-0.78.2.dev45+gd250ccb-py2.py3-none-any.whl.73b9c6a0c49d6b6bf7533478d454b7be51b9d990/sshuttle_telepresence-0.78.2.dev45+gd250ccb-py2.py3-none-any.whl/sshuttle/firewall.py", line 274, in main
29.5 35 | firewall manager: ---> restore_etc_hosts(port_v6 or port_v4)
29.5 35 | firewall manager: ---> File "/root/.pex/install/sshuttle_telepresence-0.78.2.dev45+gd250ccb-py2.py3-none-any.whl.73b9c6a0c49d6b6bf7533478d454b7be51b9d990/sshuttle_telepresence-0.78.2.dev45+gd250ccb-py2.py3-none-any.whl/sshuttle/firewall.py", line 50, in restore_etc_hosts
29.5 35 | firewall manager: ---> rewrite_etc_hosts({}, port)
29.5 35 | firewall manager: ---> File "/root/.pex/install/sshuttle_telepresence-0.78.2.dev45+gd250ccb-py2.py3-none-any.whl.73b9c6a0c49d6b6bf7533478d454b7be51b9d990/sshuttle_telepresence-0.78.2.dev45+gd250ccb-py2.py3-none-any.whl/sshuttle/firewall.py", line 29, in rewrite_etc_hosts
29.5 35 | firewall manager: ---> os.link(HOSTSFILE, BAKFILE)
29.6 35 | firewall manager: ---> OSError: [Errno 18] Invalid cross-device link: '/etc/hosts' -> '/etc/hosts.sbak'
29.6 TEL | (Cleanup) Kill BG process [23] SSH port forward (socks and proxy poll)
29.6 TEL | [23] exit 0
29.6 TEL | (Cleanup) Kill Web server for proxy poll
29.6 TEL | [35] exit -15
29.9 TEL | (Cleanup) Kill BG process [22] SSH port forward (exposed ports)
29.9 TEL | [22] exit 0
29.9 TEL | (Cleanup) Kill BG process [17] kubectl port-forward
30.0 TEL | [17] exit -15
30.0 TEL | (Cleanup) Kill BG process [16] kubectl logs
30.0 TEL | (Cleanup) Re-scale original deployment
30.0 TEL | [16] exit -15
30.0 TEL | [48] Running: kubectl --context humandb-01 --namespace humandb-01 scale deployment hdb-adv-api --replicas=1
30.6 48 | deployment.extensions/hdb-adv-api scaled
30.6 TEL | (Cleanup) Delete new deployment
30.6 TEL | [49] Running: kubectl --context humandb-01 --namespace humandb-01 delete deployment hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d
31.2 49 | deployment.extensions "hdb-adv-api-705b7d5d9b234ba7b66125f72a11063d" deleted
31.3 TEL | (Cleanup) Kill sudo privileges holder
31.3 TEL | (Cleanup) Stop time tracking
31.3 TEL | END SPAN main.py:47(main) 31.3s
31.3 TEL | (Cleanup) Remove temporary directory
31.3 TEL | (Cleanup) Remove temporary directory failed:
31.3 TEL | (Cleanup) lstat: illegal type for path parameter
31.3 TEL | (Cleanup) Save caches
32.0 TEL | (sudo privileges holder thread exiting)
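The crash above (`FileNotFoundError: [Errno 2] No such file or directory: 'yarn'`) happens because Telepresence launches the `--run` command directly with `Popen`, without a shell, so every executable the command needs must already be on the invoking machine's `PATH`. A minimal pre-flight sketch that surfaces the missing binary before the whole proxy setup runs; the `require` helper and the binary list are hypothetical, not part of Telepresence:

```shell
#!/bin/sh
# Hypothetical guard: confirm each needed binary resolves on PATH before
# starting Telepresence, since a missing one only fails after ~18s of setup.
require() {
  for bin in "$@"; do
    if ! command -v "$bin" >/dev/null 2>&1; then
      echo "error: '$bin' not found on PATH" >&2
      return 127
    fi
  done
}

require sh && echo "sh is on PATH"

# For the session in this log, one might run (assumed invocation, mirroring
# the logged command line):
#   require yarn node && telepresence --namespace humandb-01 \
#     --swap-deployment hdb-adv-api --expose 5500 --run yarn run dev
```

Because `sudo` was used to hold privileges, also note that the command may run under a different `PATH` than the interactive shell; checking with the same environment that will invoke `telepresence` avoids a false pass.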