- Quit Docker Desktop
- Open Command Prompt/PowerShell
- List the WSL distributions (and take note of the name you want to move, e.g. docker-desktop-data)
wsl --list -v
- Shut down WSL
wsl --shutdown
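- A minimal sketch of the actual move (export, unregister, re-import); docker-desktop-data and D:\wsl-data are assumptions here, so adjust the name and target folder to your setup
# assumes the distribution is docker-desktop-data and the folder D:\wsl-data already exists (hypothetical path)
wsl --export docker-desktop-data D:\wsl-data\docker-desktop-data.tar
wsl --unregister docker-desktop-data
wsl --import docker-desktop-data D:\wsl-data D:\wsl-data\docker-desktop-data.tar --version 2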
# Spider Websites with Wget – 20 Practical Examples
Wget is extremely powerful, but as with most other command-line programs, the plethora of options it supports can be intimidating to new users. What follows is a collection of wget commands you can use to accomplish common tasks, from downloading single files to mirroring entire websites. It helps to read through the wget manual, but for the busy, these commands are ready to execute.
1. Download a single file from the Internet
wget http://example.com/file.iso
2. Download a file but save it locally under a different name
wget --output-document=filename.html example.com
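The mirroring use case mentioned above follows the same pattern; a minimal sketch using wget's standard recursive options (example.com is a placeholder):
wget --mirror --convert-links --page-requisites --no-parent http://example.com/
Here --mirror turns on recursion with timestamping, --convert-links rewrites links for local browsing, --page-requisites also fetches CSS and images, and --no-parent keeps the crawl below the starting directory.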
# vim
sudo apt install vim curl wget git
First, buy a Raspberry Pi (I recommend the Zero for minimum cost) with a case (the official one is enough for the Zero). My configuration:
-- Import (FROM) / export (TO) a CSV file from/into a table
-- Ref: https://www.postgresql.org/docs/current/sql-copy.html
-- If copying all columns, the column list can be omitted
-- If the CSV file doesn't include a header row, remove 'HEADER' from the query below
-- For export, a query can be specified instead of a table and column names
COPY table_name (column_1, column_2, column_3, column_5)
[FROM/TO] 'csv_file_location' DELIMITER ',' CSV HEADER QUOTE '"' ESCAPE '"'
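-- A minimal usage sketch with hypothetical table/file names (server-side file paths
-- require the appropriate privileges, e.g. pg_read_server_files/pg_write_server_files):
COPY users (id, name, email)
FROM '/tmp/users.csv' DELIMITER ',' CSV HEADER QUOTE '"' ESCAPE '"';

-- Export the result of a query instead of a whole table
COPY (SELECT id, name FROM users WHERE active)
TO '/tmp/active_users.csv' DELIMITER ',' CSV HEADER;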
-- Dump a database on a remote host to a file
-- Ref: https://www.postgresql.org/docs/current/app-pgdump.html
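-- A minimal sketch with hypothetical host/user/database names; -Fc writes a
-- custom-format archive that can be restored selectively with pg_restore
pg_dump -h db.example.com -p 5432 -U app_user -Fc -f mydb.dump mydb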