ctrl + A => Beginning of line
ctrl + E => End of line
ctrl + L => Clears screen
ctrl + U => Clears before cursor
ctrl + H => Backspace
ctrl + R => Search previous commands
ctrl + C => Kill task
ctrl + D => Exit shell
ctrl + Z => Suspends the running task (fg brings it back to the foreground, bg resumes it in the background)
ctrl + W => Delete the word before cursor
ctrl + T => Swaps last two characters before cursor
esc + T  => Swaps last two words before cursor
With piping you can chain commands together, passing output from one to the other.
Example #1
Use case: Whenever you need to look at a large human-readable file and don't fancy scrolling in the terminal window.
Description: Pipe output to less to print text in screen-sized chunks. Navigate with the d and b keys to page down and up. For example:
$ php -i | less
Example #2
Use case: Whenever you want to search inside the result of another command. Here we want to find out what our maximum upload filesize is in PHP. Common for platforms like WordPress where you upload a zipped plugin/extension, or if you're using phpMyAdmin and want to import a database.
Description: Pipe output to grep to find text
$ php -i | grep upload
=>
..
file_uploads => On => On
max_file_uploads => 20 => 20
upload_max_filesize => 2M => 2M
..
Hopefully from the above two examples you can see the flexibility of pipes! Really a must-know for command line junkies.
With && or ; you can perform multiple commands in one line.
With && the previous command must return exit status zero, in other words it must complete without any errors. It's like saying: do this command and, if successful (&&), do this next command.
Using ; the next command will run regardless of the result of the previous command (I guess unless the previous command logs you out!).
Example #1
# we want to create a directory and quickly move to that directory
$ mkdir all/new/directory && cd all/new/directory
# outputs:
mkdir: cannot create directory 'all/new/directory': No such file or directory
In this case, the mkdir failed (because we didn't pass in the -p flag, which creates intermediary directories if they don't exist) and so the next command didn't run at all. If we try with ;, we get two errors, because mkdir can't make the directory and we can't cd into it because it doesn't exist!
$ mkdir all/new/directory ; cd all/new/directory
# outputs:
mkdir: cannot create directory ‘all/new/directory’: No such file or directory
-bash: cd: all/new/directory: No such file or directory
Example #2
# we want to copy a file and then delete the original
$ cp some_file fake/folder/copy_some_file ; rm -f some_file
# whoops! let's hope that wasn't important
Can you see the problem with the above command? We tried to neatly copy a file into a folder and then delete the original, but that folder doesn't exist! With the semicolon the remove command just deleted our file regardless, bugger! We should have used && instead, as that would mean the removal only takes place when no errors have occurred. This is a bit of a contrived example, and I probably wouldn't recommend doing this anyway, but it's a little safer.
Essentially, use && if you want the next command to run only if the previous one has been successful, i.e. has given no errors.
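For instance, a safer version of Example #2 above would be (using the same made-up paths):
$ cp some_file fake/folder/copy_some_file && rm -f some_file
# cp fails because fake/folder doesn't exist, so the rm never runs and some_file survives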
Private key permissions should be 600, and keys should live in a hidden folder (usually ~/.ssh/)
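For example, assuming your key lives at ~/.ssh/id_rsa:
$ chmod 600 ~/.ssh/id_rsa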
If for some reason the permissions don't let you access the .ssh/ folder to grab a key, you can start the ssh-agent and add the key manually:
eval "$(ssh-user -s)"
ssh-add ./path/to/key
cat prints a file to the screen
$ cat path/to/file.ext
Options
-n    # shows line numbers
Change mode. Changes the file permissions of a file or folder.
Three types of user:
  | Owner       | Group       | World
Three types of permission:
  | Read (r)    | Write (w)   | Execute (x)
In numerical format:
  | 4           | 2           | 1
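A permission is the sum of those numeric values, given in the order owner, group, world. A quick sketch (file names here are just placeholders):
$ chmod 754 script.sh   # owner: rwx (4+2+1=7), group: r-x (4+1=5), world: r (4)
$ chmod 600 secret.txt  # owner: rw (4+2=6), group and world: nothing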
Copies files
Options
-a     # improved recursive option
Tips
- Add a . after the source to copy all files and folders, including hidden ones
- Use square brackets to match numerical ranges e.g. cp dir/file[0-9] dest/
- Use curly braces to transfer more than one file e.g. cp dir/{file1,file2} dest/
- When copying a directory recursively, cp will create the destination folder if it doesn't already exist
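For example (paths are just placeholders):
$ cp path/to/file.ext path/to/copy.ext   # simple copy of a single file
$ cp -a source/. destination/            # copy everything in source, including hidden files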
Used for transferring data, usually website requests
Options
-d    # add POST data to the request, like if you were filling out a web form
-f    # fail silently
-G    # send a GET request with data instead of a POST request (data is sent as params in url)
-H    # add custom header information
-i    # include header information in response
-I    # only return the header
-L    # follow redirects
-o <file>           # output to file
-u <user:password>  # specify a username and password for server authentication
-v    # verbose
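For example (the URL and form field are just placeholders):
$ curl -i https://example.com/                   # GET request, printing the response headers too
$ curl -d "name=value" https://example.com/form  # POST some form data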
Finds files
Options
-type <f|d>   # specify whether you want to find files or directories
-not          # will exclude matching value
Examples
$ find dir/ectory/ -name "findme"
$ find dir/ectory/ -type f -not -path "*something/ignored"
$ grep -r -H "findme" . | head -n 1
  # Recursively searches for "findme", printing the file name with each match (-H), and shows only the first result
$ ls <dir>
  -a    # shows hidden files
  -l    # shows extra info (owner, size, modifed etc.)
  -h    # human readable (converts bytes to MB etc.)
Prints the contents of a directory.
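A common combination (with no directory given, it lists the current one):
$ ls -lah   # long listing, including hidden files, with human-readable sizes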
$ pwd
Prints the working directory
$ mkdir example
$ mkdir -p folders/dont/exist/example
Creates a new folder. Pass -p to create intermediary folders.
Used for transferring files. Especially handy for transferring only the differences between two sets of files.
$ rsync -avz <from/directory> <to@remoteserver:/directory>
    -a    # archive mode, essentially transfer everything (recursion)
    -v    # increase verbosity
    -n    # dry run
    -u    # ignore any files that already exist
    -z    # compression
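It's often worth doing a dry run first to see what would be transferred (paths and host are placeholders):
$ rsync -avzn local/dir/ user@remoteserver:/remote/dir/
Drop the -n once you're happy with the output.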
Removes files.
$ rm <file|directory>
    -f  # forces delete without confirmation
    -r  # deletes directories
Securely copies files
$ scp something/local user@remoteserver:dest/dir
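To pull a file down from a remote server instead (host and paths are placeholders):
$ scp user@remoteserver:dest/dir/file.ext local/dir/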
Compress files or folders
-c    # create a new archive
-v    # verbose
-f    # specify the archive file name
-x    # extract archive
-z    # use gzip compression
--exclude=PATTERN   # exclude files
$ tar -cvzf <name.tar.gz> <path>  # compress the path. note, -f option must be followed by name of file
$ tar -xvzf <name.tar.gz>         # open it up
Displays a number of lines from the end of a file, useful for logs.
$ tail -n 100 path/to/file      # will display the last 100 lines
$ tail -f path/to/file          # 'follow' the log, meaning you can view errors occurring in real-time
Truncates a file to 0 bytes:
$ cat /dev/null > path/to/file
To empty all .log files in a folder:
$ for f in *.log; do cat /dev/null > "$f"; done