$ wget -e robots=off -r -np 'http://example.com/folder/'
- -e robots=off causes it to ignore robots.txt for that domain
- -r makes it recursive
- -np = no parent, so it doesn't follow links up into the parent directory (a fuller invocation is sketched below)
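For mirroring into a tidy local tree, a few more flags help — a sketch using standard GNU wget options (adjust --cut-dirs to the URL's folder depth):

$ wget -e robots=off -r -np -nH --cut-dirs=1 -R "index.html*" --wait=1 'http://example.com/folder/'
- -nH skips creating the example.com/ host directory locally
- --cut-dirs=1 also drops the leading folder/ path component
- -R "index.html*" rejects the auto-generated directory listing pages
- --wait=1 pauses a second between requests to go easy on the server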
/* gpxparse.c

   Copyright (c) 2010, Jeremiah LaRocco [email protected]

   Permission to use, copy, modify, and/or distribute this software for any
   purpose with or without fee is hereby granted, provided that the above
   copyright notice and this permission notice appear in all copies.

   THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
   WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
   MERCHANTABILITY AND FITNESS. */
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <meta http-equiv="X-UA-Compatible" content="IE=edge" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.3, user-scalable=no" />
  <title>跑步路线地图</title> <!-- "Running route map" -->
  <style>
    html, body, #map {
      margin: 0;
sudo apt purge ibus
sudo apt autoremove
# Download the deb installer from https://shurufa.sogou.com/linux
sudo dpkg -i <sogou_xxx.deb>
sudo apt install -f   # pull in any dependencies dpkg reported as missing
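After installing, the new input method usually still has to be activated — a minimal sketch, assuming the Debian/Ubuntu im-config tool is present (Sogou's Linux build runs on the fcitx framework, which the `apt install -f` step should have pulled in):

# Assumption: im-config is available; -n sets the active input framework non-interactively.
im-config -n fcitx
# Log out and back in, then add Sogou from the fcitx configuration panel.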