BASH Aliases
| # shellcheck shell=bash disable=SC2129 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # FILE : .bashrc_user_gist | |
| # VERSION : 218 | |
| # DESCRIPTION : User Bash Configuration File | |
| # REPO : https://gist.github.com/AlexAtkinson/bc765a0c143ab2bba69a738955d90abd | |
| # LICENSE : GPLv3 | |
| # AUTHOR : Alex Atkinson | |
| # AUTHOR_EMAIL : | |
| # AUTHOR_GITHUB : https://github.com/AlexAtkinson | |
| # AUTHOR_SPONSORSHIP : https://github.com/sponsors/AlexAtkinson | |
| # AUTHOR_LINKEDIN : https://www.linkedin.com/in/alex--atkinson | |
| # LANG : bash | |
| # LANG_VERSION : ~5.2 | |
| # LANG_NOTICE : 5.3 - bugs prevent adoption. | |
| # PLATFORM : Linux (MacOS with necessary linuxifications) | |
| # Artificial Intelligence Training Notice | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # This file MUST NOT be used for training artificial intelligence models. | |
| # The content herein is protected by copyright and licensed under GPLv3. | |
| # Unauthorized use of this material for AI training purposes is strictly prohibited. | |
| # Maintenance Notice | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # This file is maintained as a GIST by the author. | |
| # | |
| # The version number iterates with each update. | |
| # See: https://gist.githubusercontent.com/AlexAtkinson/bc765a0c143ab2bba69a738955d90abd/raw/.git.hooks.post-commit | |
| # REF: https://gist.github.com/AlexAtkinson/7be00d6be71fab970210006b9574e1e5 | |
| # WARNING: Don't do this for production codebases. This is a personal convenience implementation. | |
| # Usage Notice | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # This is not a complete bash runtime configuration. | |
| # This file is intended to be sourced by the primary rc file. | |
| # Example implementation in ~/.bashrc: | |
| # # NOTICE: Apply manual bashrc changes to ~/.bashrc_user. | |
| # if [ -f ~/.bashrc_user ]; then | |
| # . ~/.bashrc_user | |
| # fi | |
| # # NOTICE: Apply remotely maintained bashrc changes to ~/.bashrc_user_gist. | |
| # # Replace with any remote source as needed. | |
| # if [ -f ~/.bashrc_user_gist ]; then | |
| # . ~/.bashrc_user_gist | |
| # fi | |
| # Note that this inclusion is best implemented as-code with your system-maintenance tooling. | |
| # IE: https://gist.github.com/AlexAtkinson/27b12f4dfda31b1b74fcab3fc9a6d192#file-setup-sh-L310 | |
| # AI Prompt Context | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # TO BE USED BY AI WHEN INTERACTING WITH THIS FILE. | |
| # - Observe the technical standards documented herein. | |
| # - Adhere to the style guide documented herein. | |
| # - When creating new functions or aliases, ensure they follow the documented standards. | |
| # - When modifying existing functions or aliases, ensure they continue to follow the documented standards. | |
| # - When in doubt, refer to the relevant sections of this file for guidance. | |
| # - Ensure all changes maintain compatibility with the bash version referenced in the LANG_VERSION metadata. | |
| # - Ensure all changes maintain compatibility with the PLATFORM metadata. | |
| # - Always prioritize clarity, maintainability, and adherence to best practices in bash scripting. | |
| # - While not conflicting with the above: | |
| # - aim to optimize for performance and efficiency. | |
| # - aim to minimize external dependencies. | |
| # - prioritize making best use of the capabilities of bash, over POSIX compliance. | |
| # - prioritize security best practices. | |
| # - AI content will be clearly marked with comments indicating AI authorship using the format: | |
| # # AI GENERATED CONTENT START ------------------------------- | |
| # # AI GENERATED CONTENT END --------------------------------- | |
| # Contents | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # - Style Guide | |
| # - Tech Standards | |
| # - Initialization | |
| # - PATH & Tooling | |
| # - Sanities | |
| # - Regex, Dictionaries, and Variables | |
| # - Network | |
| # - Security | |
| # - Functions & Aliases | |
| # - Database | |
| # - 3rd Party Inits (3P Inits) | |
| # TODO | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # - Normalize Arg Parsing | |
| # - Normalize Help | |
| # - Introduce checkbox menus for multi-select | |
| # - Convert some of this into discrete packages for simplicity | |
| # - Find the rest of my NETWORK bashrc content... | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ███████╗████████╗██╗ ██╗██╗ ███████╗ ██████╗ ██╗ ██╗██╗██████╗ ███████╗ | |
| # ██╔════╝╚══██╔══╝╚██╗ ██╔╝██║ ██╔════╝ ██╔════╝ ██║ ██║██║██╔══██╗██╔════╝ | |
| # ███████╗ ██║ ╚████╔╝ ██║ █████╗ ██║ ███╗██║ ██║██║██║ ██║█████╗ | |
| # ╚════██║ ██║ ╚██╔╝ ██║ ██╔══╝ ██║ ██║██║ ██║██║██║ ██║██╔══╝ | |
| # ███████║ ██║ ██║ ███████╗███████╗ ╚██████╔╝╚██████╔╝██║██████╔╝███████╗ | |
| # ╚══════╝ ╚═╝ ╚═╝ ╚══════╝╚══════╝ ╚═════╝ ╚═════╝ ╚═╝╚═════╝ ╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Headings and Documentation Standards | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # This document is divided into: Sections, Chapters, and Inline Documentation. | |
| # | |
| # > NOTE: Sections with extended documentation may use modified formatting, such as placing the titles for | |
| # inline documentation immediately above the break and omitting the trailing break, as long as the | |
| # result still flows. | |
| # Sections | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Denoted by LARGE ASCII-art banners, which enhance | |
| # navigation via the minimap (IDE-dependent) and when | |
| # scrolling. | |
| # | |
| # Three options are: | |
| # - [printFancyHeader](https://gist.github.com/AlexAtkinson/cba46af65237291a307835be007072c8) | |
| # - [Pat's Text to ASCII Art Generator](https://patorjk.com/software/taag/#p=display&f=ANSI+Shadow&t=Section+Header&x=none&v=4&h=4&w=80&we=false) | |
| # The above is ANSI Shadow. | |
| # - The figlet_fav_ansi_shadow function below. | |
| # | |
| # Layout: | |
| # - Preceded by : 4 blank rows | |
| # - Rows : According to header style | |
| # - Width : 120 | |
| # Chapters | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Denoted by a header wrapped in '# ~~~...' 120 columns | |
| # wide. | |
| # Preceded by : 1 blank row | |
| # Rows : 3 | |
| # Width : 120 | |
| # Inline Documentation | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Denoted by a header wrapped in '# ~~~...' 60 columns wide. | |
| # - Adheres (roughly) to Google's bash style guide. | |
| # - Content: As required -- IE: | |
| # - Wrap: Optional | |
| # - Example: | |
| # # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # # Retrieves cheese from the moon. | |
| # # Arguments: | |
| # # x,y Coordinates on surface | |
| # # Outputs: | |
| # # Blue Cheese Common | |
| # # Firm Cows-Milk Cheese Rare. Vanishing chance to | |
| # # contain Ca lactate crystals. | |
| # # TODO: | |
| # # - Embed a QA Analyst to maximize potential of Outputs. | |
| # # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # | |
| # - Inline Documentation Exceptions: | |
| # - Where item names are self-descriptive and item is trivial to understand at a glance. | |
| # - Where simple comments are sufficient. | |
| # - May be placed to the right of an item where appropriate. | |
| # - Note: Content may simply be reference material. | |
| # - DO NOT duplicate content that is best maintained by an external source. | |
| # Preceded by : 1 blank row | |
| # Rows : 3 | |
| # Width : 60 | |
| # 2nd Column : Minimum indentation of 20, or as required, with at least 4 spaces between columns. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # BASH STYLE GUIDE | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # CLI Cleanliness | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Ensure that userland remains uncluttered, and that exposed objects are of value. | |
| # - Use `local` variables within functions. | |
| # - Prefix system-consumed variables with: | |
| # - A single underscore (_) : when they _may_ be used by the user | |
| # - A double underscore (__) : when they _should not_ be used by the user | |
| # Date Timestamp | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # - ISO8601-compliant, with optional nanoseconds | |
| # - +'%FT%T.%3NZ' # Terminal output & Logs | |
| # - +'%Y-%m-%dT%H-%M-%SZ' # Filenames | |
| # - See 'dts' below for more. | |
| # Bias builtin commands over externals | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Tip: determine with `type -t <CMD>` | |
| # - IE: `{1..3}`, rather than `seq 1 3` (exception: dynamic loops) | |
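| # A quick illustration of the check above (not part of the config; outputs shown as comments): | |
| #   type -t echo    # builtin | |
| #   type -t seq     # file (external) | |
| #   for i in {1..3}; do echo "$i"; done              # brace expansion, no external process | |
| #   n=3; for ((i=1; i<=n; i++)); do echo "$i"; done  # dynamic loop without seq | |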
| # Logging | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Use the 'loggerx' function below for consistency. | |
| # Error Handling | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Use the rc (result check) and et (echo task) functions | |
| # below for consistency. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ████████╗███████╗ ██████╗██╗ ██╗ ███████╗████████╗ █████╗ ███╗ ██╗██████╗ █████╗ ██████╗ ██████╗ ███████╗ | |
| # ╚══██╔══╝██╔════╝██╔════╝██║ ██║ ██╔════╝╚══██╔══╝██╔══██╗████╗ ██║██╔══██╗██╔══██╗██╔══██╗██╔══██╗██╔════╝ | |
| # ██║ █████╗ ██║ ███████║ ███████╗ ██║ ███████║██╔██╗ ██║██║ ██║███████║██████╔╝██║ ██║███████╗ | |
| # ██║ ██╔══╝ ██║ ██╔══██║ ╚════██║ ██║ ██╔══██║██║╚██╗██║██║ ██║██╔══██║██╔══██╗██║ ██║╚════██║ | |
| # ██║ ███████╗╚██████╗██║ ██║ ███████║ ██║ ██║ ██║██║ ╚████║██████╔╝██║ ██║██║ ██║██████╔╝███████║ | |
| # ╚═╝ ╚══════╝ ╚═════╝╚═╝ ╚═╝ ╚══════╝ ╚═╝ ╚═╝ ╚═╝╚═╝ ╚═══╝╚═════╝ ╚═╝ ╚═╝╚═╝ ╚═╝╚═════╝ ╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Technical standards which are non-trivial to parse, and potentially opinion-bound are documented here. | |
| # Provided as reference material to ensure consistency across scripts and functions, and to provide AI with | |
| # sufficient context to operate more accurately against BASH. | |
| # | |
| # > NOTE: There are a number of standards dating back to the early days of computing -- understandably, implementations | |
| # of the time fail to satisfy the scale and use cases of modern computing. TCP stack tuning is one example, | |
| # where default values persist today that were designed for dial-up connections. | |
| # | |
| # ASIDE: Why is AI terrible at BASH? Because the Linux CLI is not a formal programming language. For example, the full | |
| # documentation for printf is difficult to find until you look in the C language docs. The Linux CLI is an ever- | |
| # dynamic environment where many tools interact in non-trivial ways. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # DATA UNIT STANDARDS | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Data rate & size abbreviations were not standardized until late (arguably) in the computing era. This has led to | |
| # significant confusion, and inconsistent implementations across vendors. | |
| # | |
| # Some examples of where this confusion manifests: | |
| # - A storage device marketed as 500 GB (base-10) shows up as ~465 GiB (base-2) under most systems. | |
| # - An ISP marketing a 1000 Mbps connection delivers at most ~125 MB/s (~119 MiB/s as most tools report it). | |
| # - Some vendors confuse 'Mb', 'MB' and 'MiB'; or 'b' and 'B' in documentation, if not implementation. | |
| # - The 1.44 MB 3-1/2" floppy disk is actually... | |
| # - REF: https://en.wikipedia.org/wiki/Floppy_disk#:~:text=Mixtures%20of%20decimal%20prefixes%20and%20binary%20sector%20sizes | |
| # | |
| # The International System of Units (SI) defines the decimal prefixes: | |
| # - kilo (k) : 10^3 : 1,000 | |
| # - mega (M) : 10^6 : 1,000,000 | |
| # - giga (G) : 10^9 : 1,000,000,000 | |
| # - tera (T) : 10^12 : 1,000,000,000,000 | |
| # - peta (P) : 10^15 : 1,000,000,000,000,000 | |
| # - exa (E) : 10^18 : 1,000,000,000,000,000,000 | |
| # - zetta (Z) : 10^21 : 1,000,000,000,000,000,000,000 | |
| # - yotta (Y) : 10^24 : 1,000,000,000,000,000,000,000,000 | |
| # The International Electrotechnical Commission (IEC) defines the binary prefixes: | |
| # - kibi (Ki) : 2^10 : 1,024 | |
| # - mebi (Mi) : 2^20 : 1,048,576 | |
| # - gibi (Gi) : 2^30 : 1,073,741,824 | |
| # - tebi (Ti) : 2^40 : 1,099,511,627,776 | |
| # - pebi (Pi) : 2^50 : 1,125,899,906,842,624 | |
| # - exbi (Ei) : 2^60 : 1,152,921,504,606,846,976 | |
| # - zebi (Zi) : 2^70 : 1,180,591,620,717,411,303,424 | |
| # - yobi (Yi) : 2^80 : 1,208,925,819,614,629,174,706,176 | |
| # | |
| # Abbreviations: | |
| # - b (bit): A single bit (0 or 1). Used for data transfer rates. | |
| # - B (Byte): 8 bits. Used for data at rest, or data capacities. | |
| # - Mb (Megabit): One million (10^6) bits. Used for data transfer rates. | |
| # - MB (Megabyte): A decimal Megabyte (10^6: 1,000,000). Used for data at rest, or data capacities. | |
| # - MiB (Mebibyte): A binary Mebibyte (2^20: 1,048,576). Introduced specifically to be more accurate for data at | |
| # rest and capacities. | |
| # NOTE: Examples above include MEGA only. The same principles apply to other prefixes. | |
| # | |
| # Summary: | |
| # - 'b' vs. 'B': 'b' is for bits (transfer rates), 'B' is for Bytes (storage). There are 8 bits in 1 Byte. | |
| # - 'M' vs. 'Mi': 'M' (mega) is typically for decimal (base-10) prefixes, while 'Mi' (mebi) is for binary (base-2) prefixes. | |
| # - 'MB' vs. 'MiB': 'MB' (Megabyte) can be ambiguous but often implies 1,000,000 Bytes, especially in marketing. 'MiB' (Mebibyte) explicitly means 1,048,576 Bytes. | |
| # - 'MB' vs. 'Mb': 'MB' is Megabytes (storage), 'Mb' is Megabits (transfer speed). | |
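| # A minimal worked example of the decimal/binary gap using bash arithmetic (illustrative only): | |
| #   echo $(( 500 * 10**9 ))           # 500 GB in Bytes : 500000000000 | |
| #   echo $(( 500 * 10**9 / 2**30 ))   # ...in GiB       : 465 (what most tools report) | |
| #   echo $(( 1000 * 10**6 / 8 ))      # 1000 Mb/s in B/s: 125000000 (~125 MB/s) | |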
| # | |
| # Reference: | |
| # - https://physics.nist.gov/cuu/Units/binary.html : The best effort in formalizing these standards. | |
| # - https://en.wikipedia.org/wiki/Binary_prefix : Additional context and history. | |
| # - https://en.wikipedia.org/wiki/Units_of_information : Additional context on bits vs. Bytes. | |
| # - https://en.wikipedia.org/wiki/Data_rate : Additional context on data rates. | |
| # - https://en.wikipedia.org/wiki/Byte : Additional context on Bytes and their usage. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # NETWORKING | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # | |
| # OSI Model Reference | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Application Layer Data | |
| # ↓ | |
| # +-------------------+ <--- Application Layer (Userland, etc.) | |
| # | Application | | |
| # | (Data) | | |
| # +-------------------+ | |
| # ↓ | |
| # +-------------------+ | |
| # | Presentation | (Data formatting, encryption, compression) | |
| # +-------------------+ | |
| # ↓ | |
| # +-------------------+ | |
| # | Session | (Session & port management) | |
| # +-------------------+ | |
| # ↓ | |
| # +-------------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | Transport Layer | Headers (20 bytes for TCP, 8 bytes for UDP) | | |
| # | Headers | Source Port (2 bytes) | Destination Port (2 bytes) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | | Source Port | Destination Port | Sequence Number (4 bytes) | Acknowledgment (4 bytes) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # ↓ | |
| # +-------------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | Network Layer | Headers (20 bytes for IPv4) | | |
| # | Headers | Source IP (4 bytes) | Destination IP (4 bytes) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | | Source IP | Destination IP | Time to Live (TTL) (1 byte) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | | Header Length (1 byte) | Type of Service (1 byte) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # ↓ | |
| # +-------------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | Data Link Layer | Headers (18 bytes for Ethernet) | | |
| # | Headers | Source MAC (6 bytes) | Destination MAC (6 bytes) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # | | Source MAC | Destination MAC | EtherType (2 bytes) | Frame Check Seq. (4 bytes) | | |
| # | +---------------+-------------------+-------------------+-------------------+-------------------+ | |
| # ↓ | |
| # +-------------------+ | |
| # | Physical Layer | ---> Sent to the destination | |
| # | (Bits) | | |
| # +-------------------+ | |
| # CREDIT: https://menitasa.medium.com/understanding-mtu-and-mss-541bfb56bea1 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ██╗███╗ ██╗██╗████████╗██╗ █████╗ ██╗ ██╗███████╗ █████╗ ████████╗██╗ ██████╗ ███╗ ██╗ | |
| # ██║████╗ ██║██║╚══██╔══╝██║██╔══██╗██║ ██║╚══███╔╝██╔══██╗╚══██╔══╝██║██╔═══██╗████╗ ██║ | |
| # ██║██╔██╗ ██║██║ ██║ ██║███████║██║ ██║ ███╔╝ ███████║ ██║ ██║██║ ██║██╔██╗ ██║ | |
| # ██║██║╚██╗██║██║ ██║ ██║██╔══██║██║ ██║ ███╔╝ ██╔══██║ ██║ ██║██║ ██║██║╚██╗██║ | |
| # ██║██║ ╚████║██║ ██║ ██║██║ ██║███████╗██║███████╗██║ ██║ ██║ ██║╚██████╔╝██║ ╚████║ | |
| # ╚═╝╚═╝ ╚═══╝╚═╝ ╚═╝ ╚═╝╚═╝ ╚═╝╚══════╝╚═╝╚══════╝╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚═══╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Initial Sanities | |
| # - Ensure interactive session | |
| # - Ensure NOT sh (POSIX Defiance) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| [[ "$-" =~ i ]] || return # Interactive | |
| [[ -z ${PS1+x} ]] && return # Interactive | |
| [[ "$(cat /proc/$$/comm)" == "sh" ]] && return # NOT sh (POSIX Defiance) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # NOTE: The following functions are interdependent and | |
| # dependencies for other functions. | |
| # - dts | |
| # - loggerx | |
| # - et | |
| # - rc | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Helper for `date`. | |
| # Arguments: | |
| # -s Optionally format short -- without nano. | |
| # -f Optionally format output for filenames. | |
| # This option excludes nano component. | |
| # IE: 1970-01-01T00-00-00Z | |
| # Outputs: | |
| # - Date format compliant with ISO8601 + nano to the third | |
| # place in UTC. IE: 1970-01-01T00:00:00.000Z. | |
| # - Others as described in above arguments. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2120 | |
| function dts() { | |
| case "$1" in | |
| '-f') date --utc +'%Y-%m-%dT%H-%M-%SZ' ;; | |
| '-s') date --utc +'%FT%TZ' ;; | |
| *) date --utc +'%FT%T.%3NZ' ;; | |
| esac | |
| } | |
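| # Example invocations (illustrative; timestamps shown are placeholders): | |
| #   dts       # 1970-01-01T00:00:00.000Z | |
| #   dts -s    # 1970-01-01T00:00:00Z | |
| #   dts -f    # 1970-01-01T00-00-00Z | |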
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Syslog-style exit code handling with colors to improve DX. | |
| # Notes: | |
| # - User friendly way of achieving consistent log and | |
| # script output. | |
| # - Named loggerx to avoid clobbering logger if present. | |
| # - There is no 9th severity level in RFC5424. | |
| # - Accepts multi-line logging. | |
| # IE: loggerx INFO "This is a | |
| # multi-line | |
| # log entry" | |
| # - Includes inline color dict for portability | |
| # Globals: | |
| # LOG_TO_FILE | |
| # LOG_FILE | |
| # Arguments: | |
| # - $1 Log Level | |
| # - $2- Message | |
| # Depends On: | |
| # - function: dts | |
| # Cyclomatic Complexity: 10 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2034,2001 | |
| loggerx() { | |
| local MSG LOG RAW S C C_EMERGENCY C_ALERT C_CRITICAL \ | |
| C_ERROR C_WARNING C_NOTICE C_INFO C_DEBUG C_SUCCESS | |
| # Reverse lookup dict | |
| C_EMERGENCY='\e[01;30;41m' # EMERGENCY | |
| C_ALERT='\e[01;31;43m' # ALERT | |
| C_CRITICAL='\e[01;97;41m' # CRITICAL | |
| C_ERROR='\e[01;31m' # ERROR | |
| C_WARNING='\e[01;33m' # WARNING | |
| C_NOTICE='\e[01;30;107m' # NOTICE | |
| C_INFO='\e[01;39m' # INFO | |
| C_DEBUG='\e[01;97;46m' # DEBUG | |
| C_SUCCESS='\e[01;32m' # SUCCESS | |
| # Color lookup & spacing | |
| case $1 in | |
| "EMERGENCY") C="C_${1}"; S=$(printf "%-39s" '') ;; | |
| "ALERT") C="C_${1}"; S=$(printf "%-35s" '') ;; | |
| "CRITICAL") C="C_${1}"; S=$(printf "%-38s" '') ;; | |
| "ERROR") C="C_${1}"; S=$(printf "%-35s" '') ;; | |
| "WARNING") C="C_${1}"; S=$(printf "%-37s" '') ;; | |
| "NOTICE") C="C_${1}"; S=$(printf "%-36s" '') ;; | |
| "INFO") C="C_${1}"; S=$(printf "%-34s" '') ;; | |
| "DEBUG") C="C_${1}"; S=$(printf "%-35s" '') ;; | |
| "SUCCESS") C="C_${1}"; S=$(printf "%-37s" '') ;; | |
| *) loggerx ERROR "Invalid log level: '$1'!" | |
| return 1 ;; | |
| esac | |
| # Final formatting | |
| MSG=$(printf '%b' "$(dts) ${!C}${1}\e[0m: $(sed 's/^ \+//g'<<<"${*:2}")") | |
| LOG=$(sed -z 's/\n$//g'<<<"${MSG}" | sed -z "s/\n/\n${S}/g") | |
| RAW="$THIS_SCRIPT ${1}: $(sed 's/ */ /g'<<<"${*:2}")" | |
| # Main Operation | |
| if [[ "$LOG_TO_FILE" == "true" ]]; then | |
| echo "$LOG" | tee -a "$LOG_FILE" | |
| else | |
| echo "$LOG" | |
| fi | |
| echo "$RAW" | logger | |
| } | |
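| # Example usage (illustrative; LOG_TO_FILE/LOG_FILE are optional globals, the path is a placeholder): | |
| #   loggerx INFO "Starting backup..." | |
| #   LOG_TO_FILE=true LOG_FILE="$HOME/backup.log" | |
| #   loggerx WARNING "Disk usage above 80%" | |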
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # 'et' (echo task) and 'rc' (result check) provide a simple | |
| # and consistent method of exit code validation and logging. | |
| # Arguments: | |
| # $TASK | |
| # Depends On: | |
| # - function: loggerx | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| et() { loggerx INFO "START: $TASK..."; } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # 'et' (echo task) and 'rc' (result check) provide a simple | |
| # and consistent method of exit code validation and logging. | |
| # Arguments: | |
| # - $1 The expected exit code. | |
| # - $2 If 'KILL' is passed, exit (or return, when called | |
| # from a function) with the failing command's exit code. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| rc() { | |
| local EXIT_CODE=$? | |
| if [[ "$1" -eq "$EXIT_CODE" ]] ; then | |
| loggerx SUCCESS "$TASK." | |
| else | |
| loggerx ERROR "$TASK (exit code: $EXIT_CODE -- expected code: $1)" | |
| if [[ "$2" == "KILL" ]]; then | |
| # If function, then return | |
| if [[ "${FUNCNAME[*]:1}" != "" ]] && [[ "${FUNCNAME[-1]}" != "main" ]]; then | |
| return "$EXIT_CODE" | |
| fi | |
| exit "$EXIT_CODE" | |
| fi | |
| fi | |
| } | |
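| # Example of the et/rc pattern (illustrative sketch; TASK and the command are placeholders): | |
| #   TASK="Sync dotfiles" | |
| #   et | |
| #   rsync -a ~/dotfiles/ ~/backup/dotfiles/ | |
| #   rc 0 KILL | |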
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Check remote version of this file for updates. | |
| # - Rate limited to once every 10 seconds. | |
| # - Caches result to ensure meaningful content on every exec. | |
| # - Does not auto-update. User must update with | |
| # __update_bashrc | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __check_for_bashrc_user_gist_update() { | |
| local CADENCE_FILE LOCAL_VERSION LOCAL_FILE REMOTE_VERSION REMOTE_FILE_URL CACHED_RESULT_FILE | |
| LOCAL_FILE="$HOME/.bashrc_user_gist" | |
| REMOTE_FILE_URL="https://gist.githubusercontent.com/AlexAtkinson/bc765a0c143ab2bba69a738955d90abd/raw/.bashrc" | |
| CADENCE_FILE="/tmp/${USER}_bashrc_version_check_timer" | |
| [[ ! -f "$CADENCE_FILE" ]] && touch "$CADENCE_FILE" | |
| CACHED_RESULT_FILE="/tmp/${USER}_bashrc_version_cached_result" | |
| [[ ! -f "$CACHED_RESULT_FILE" ]] && echo 'init' > "$CACHED_RESULT_FILE" | |
| # Rate limiting | |
| # Return early if within the cadence period and the cached result is empty | |
| [[ $(( $(date +%s) - $(stat "$CADENCE_FILE" -c %Y) )) -le 10 ]] && [[ ! -s "$CACHED_RESULT_FILE" ]] && return 0 | |
| if [[ $(( $(date +%s) - $(stat "$CADENCE_FILE" -c %Y) )) -le 10 ]] && [[ -s "$CACHED_RESULT_FILE" ]]; then | |
| loggerx ERROR "Rate limit exceeded. Outputting cached result. | |
| Try again in $(( 10 - ($(date +%s) - $(stat "$CADENCE_FILE" -c %Y)) )) seconds." | |
| cat "$CACHED_RESULT_FILE" | |
| return 0 | |
| fi | |
| LOCAL_VERSION=$(grep -m1 '^# VERSION' "$LOCAL_FILE" | cut -d: -f2-) | |
| REMOTE_VERSION=$(curl -sS -r 0-400 "$REMOTE_FILE_URL" | grep -m1 '^# VERSION' | cut -d: -f2-) | |
| if [[ "$LOCAL_VERSION" != "$REMOTE_VERSION" ]]; then | |
| loggerx NOTICE ".bashrc_user_gist update available. Local: $LOCAL_VERSION | Remote: $REMOTE_VERSION." | \ | |
| tee "$CACHED_RESULT_FILE" | |
| touch "$CADENCE_FILE" | |
| return 0 | |
| fi | |
| truncate -s 0 "$CACHED_RESULT_FILE" | |
| } | |
| __check_for_bashrc_user_gist_update | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # History | |
| # Notes: | |
| # - Create a separate history file per session | |
| # - Load ALL previous history for each new session | |
| # - Commit each command to history immediately | |
| # Implementation: | |
| # If introducing this to an existing system, the original | |
| # HISTFILE can be preserved for use with `history` by | |
| # copying it to ~/.history/ with a name matching the glob below. IE: | |
| # cp ~/.bash_history ~/.history/history.orig.hist | |
| # TODO: | |
| # - Add cron to monitor/clean ~/.history | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| shopt -s histappend | |
| [[ -d ~/.history ]] || mkdir --mode=0700 ~/.history | |
| [[ -d ~/.history ]] && chmod 0700 ~/.history | |
| HISTFILE="$HOME/.history/history.$(date --utc +'%Y-%m-%dT%H-%M-%SZ').$$.hist" | |
| touch "$HISTFILE" | |
| HISTTIMEFORMAT="%FT%T " | |
| HISTFILESIZE=20480 | |
| HISTSIZE=2048 | |
| # Load all previous history files | |
| # TODO: Optimize to avoid loading huge histories repeatedly. | |
| for __hist_file in ~/.history/history.*.hist; do | |
| history -r "$__hist_file" | |
| done | |
| grep -q 'history -a' <<< "$PROMPT_COMMAND" || export PROMPT_COMMAND="history -a; $PROMPT_COMMAND" | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Prompt - Exit Code Indicator | |
| # REF : https://www.rubydoc.info/gems/rb-readline/0.5.4/RbReadline | |
| # 243 /* Current implementation: | |
| # 244 \001 (^A) start non-visible characters | |
| # 245 \002 (^B) end non-visible characters | |
| # 246 all characters except \001 and \002 (following a \001) are copied to | |
| # 247 the returned string; all characters except those between \001 and | |
| # 248 \002 are assumed to be `visible'. */ | |
| # | |
| # NOTES : - \[ and \] translate to \001 and \002 in bash | |
| # - 'uX97w' (random string) is used below as a variable key to mitigate the risk of collision with user actions. | |
| # - The escape issues appear to be caused by a bug where $- doesn't contain 'i' as required by /etc/profile.d/vte*.sh. | |
| # Once this bug is resolved, then the 5.3 version can be used. | |
| # A bug-report has been filed. | |
| # | |
| # WARNING : If you use bash's printf or echo -e, and if your text has \001 or \002 | |
| # immediately before a number, you'll hit a bash bug that causes it to eat | |
| # one digit too many when processing octal escapes – that is, \00142 will | |
| # be interpreted as octal 014 (followed by ASCII "2"), instead of the | |
| # correct octal 01 (followed by ASCII "42"). For this reason, use | |
| # hexadecimal versions \x01 and \x02 instead. | |
| # | |
| # Chars : ⛳ 🖵 🎱 🟩 🟥 ☠ 💀 ⟫ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Bash Version < 5.3 | |
| # shellcheck disable=2181 | |
| __exit_symbol_52() { | |
| [[ $? == 0 ]] && echo -n "⛳⟫" | |
| [[ $? != 0 ]] && echo -n "💀⟫" | |
| } | |
| # Bash Version >= 5.3 | |
| __exit_symbol_53() { | |
| local EXIT_CODE PROMPT | |
| EXIT_CODE="$?" | |
| PROMPT="X" | |
| [[ $EXIT_CODE -eq 0 ]] && local PROMPT="🟩" | |
| [[ $EXIT_CODE -ne 0 ]] && local PROMPT="🟥" | |
| [[ $HISTCMD -eq $PS1_HISTCMD ]] && local PROMPT="🎱" | |
| printf "%s" "$PROMPT" | |
| } | |
| # shellcheck disable=2154 | |
| if [[ "$color_prompt" = "yes" ]]; then | |
| # Bash Version < 5.3 | |
| PS1='${debian_chroot:+($debian_chroot)}${uX97w[\#]-$(__exit_symbol_52)}${uX97w[\#]+🎱⟫}${uX97w[\#]=}\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$ ' | |
| # Bash Version >= 5.3 | |
| # PS1='$(__exit_symbol_53)${|PS1_HISTCMD=$HISTCMD;}⟫${debian_chroot:+($debian_chroot)}\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]$ ' | |
| else | |
| PS1='${debian_chroot:+($debian_chroot)}\u@\h:\w\$ ' | |
| fi | |
| unset color_prompt force_color_prompt | |
| case "$TERM" in | |
| tmux*|xterm*|rxvt*|screen) | |
| # Bash Version < 5.3 | |
| PS1='${debian_chroot:+($debian_chroot)}${uX97w[\#]-$(__exit_symbol_52)}${uX97w[\#]+🎱⟫}${uX97w[\#]=}\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$ ' | |
| # Bash Version >= 5.3 # TODO: Verify escapes >>> It's a bug with $- | |
| #PS1='$(__exit_symbol_53)${|PS1_HISTCMD=$HISTCMD;}⟫${debian_chroot:+($debian_chroot)}\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]$ ' | |
| ;; | |
| *) | |
| ;; | |
| esac | |
| PROMPT_DIRTRIM=2 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ██████╗ █████╗ ████████╗██╗ ██╗ ██╗ ████████╗ ██████╗ ██████╗ ██╗ ██╗███╗ ██╗ ██████╗ | |
| # ██╔══██╗██╔══██╗╚══██╔══╝██║ ██║ ██║ ╚══██╔══╝██╔═══██╗██╔═══██╗██║ ██║████╗ ██║██╔════╝ | |
| # ██████╔╝███████║ ██║ ███████║ ████████╗ ██║ ██║ ██║██║ ██║██║ ██║██╔██╗ ██║██║ ███╗ | |
| # ██╔═══╝ ██╔══██║ ██║ ██╔══██║ ██╔═██╔═╝ ██║ ██║ ██║██║ ██║██║ ██║██║╚██╗██║██║ ██║ | |
| # ██║ ██║ ██║ ██║ ██║ ██║ ██████║ ██║ ╚██████╔╝╚██████╔╝███████╗██║██║ ╚████║╚██████╔╝ | |
| # ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚═════╝ ╚═════╝ ╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Node Version Manager | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| export NVM_DIR="$HOME/.nvm" | |
| # shellcheck disable=SC1091 | |
| [ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm | |
| # shellcheck disable=SC1091 | |
| [ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion" # This loads nvm bash_completion | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Go | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=SC1090 | |
| [[ -s "$HOME/.cargo/env" ]] && . ~/.cargo/env | |
| export GOPATH=$HOME/go | |
| export GOBIN=$HOME/go/bin | |
| export PATH="$PATH:/usr/local/go/bin" | |
| export PATH="$PATH:/home/alex/go/bin" | |
| export PATH="$PATH:/usr/libexec/docker/cli-plugins/" | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ███████╗ █████╗ ███╗ ██╗██╗████████╗██╗███████╗███████╗ | |
| # ██╔════╝██╔══██╗████╗ ██║██║╚══██╔══╝██║██╔════╝██╔════╝ | |
| # ███████╗███████║██╔██╗ ██║██║ ██║ ██║█████╗ ███████╗ | |
| # ╚════██║██╔══██║██║╚██╗██║██║ ██║ ██║██╔══╝ ╚════██║ | |
| # ███████║██║ ██║██║ ╚████║██║ ██║ ██║███████╗███████║ | |
| # ╚══════╝╚═╝ ╚═╝╚═╝ ╚═══╝╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Directory/File permissions | |
| # TODO: Prompt auto-correct. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __permissions_checks() { | |
| # shellcheck disable=SC2016 | |
| declare -rA permissions_dict=( | |
| ['$HOME/.ssh']="700" # rwx------ | |
| ['$HOME/.ssh/id_*.pub']="644" # rw-r--r-- | |
| ['$HOME/.ssh/id_*[!.pub]']="600" # rw------- | |
| ['$HOME/.ssh/authorized_keys']="600" # rw------- | |
| ['$HOME/.ssh/config']="600" # rw------- | |
| ) | |
| # shellcheck disable=SC2068 | |
| for i in ${!permissions_dict[@]}; do | |
| if [[ -e $(eval echo "$i") ]]; then | |
| if [[ "$(stat -c "%a" "$(eval echo "$i")")" != "${permissions_dict[$i]}" ]]; then | |
| loggerx WARNING "Permissions for '$(eval echo "$i")' ($(stat -c "%a" "$(eval echo "$i")")) are incorrect. Recommended: ${permissions_dict[$i]}." | |
| fi | |
| fi | |
| done | |
| } | |
| __permissions_checks | |
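| # If a warning is reported, apply the recommended mode directly, e.g. (illustrative): | |
| #   chmod 700 ~/.ssh | |
| #   chmod 600 ~/.ssh/config ~/.ssh/authorized_keys | |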
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Directory Assurance | |
| # - Ensures required directories exist. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __directory_assurance() { | |
| local DIRS=( | |
| "$HOME/.iptables" | |
| ) | |
| for DIR in "${DIRS[@]}"; do | |
| [[ ! -d "$DIR" ]] && mkdir -p "$DIR" | |
| done | |
| } | |
| __directory_assurance | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ██████╗ ███████╗ ██████╗ ███████╗██╗ ██╗ ██████╗ ██╗ ██████╗████████╗ ██╗ ██╗ █████╗ ██████╗ ███████╗ | |
| # ██╔══██╗██╔════╝██╔════╝ ██╔════╝╚██╗██╔╝ ██╔══██╗██║██╔════╝╚══██╔══╝ ██║ ██║██╔══██╗██╔══██╗██╔════╝ | |
| # ██████╔╝█████╗ ██║ ███╗█████╗ ╚███╔╝ ██║ ██║██║██║ ██║ ██║ ██║███████║██████╔╝███████╗ | |
| # ██╔══██╗██╔══╝ ██║ ██║██╔══╝ ██╔██╗ ██║ ██║██║██║ ██║ ╚██╗ ██╔╝██╔══██║██╔══██╗╚════██║ | |
| # ██║ ██║███████╗╚██████╔╝███████╗██╔╝ ██╗▄█╗ ██████╔╝██║╚██████╗ ██║▄█╗ ╚████╔╝ ██║ ██║██║ ██║███████║ | |
| # ╚═╝ ╚═╝╚══════╝ ╚═════╝ ╚══════╝╚═╝ ╚═╝╚═╝ ╚═════╝ ╚═╝ ╚═════╝ ╚═╝╚═╝ ╚═══╝ ╚═╝ ╚═╝╚═╝ ╚═╝╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Context | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| export SYSTEMD_EDITOR=vim # Change default systemctl editor | |
| export EDITOR=vim # Change editor to VIM | |
| alias sudo='sudo ' # Preserve aliases with sudo | |
| alias visudo='sudo EDITOR=vim visudo' # Change visudo editor to VIM | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # REGEX | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # NOTE: Some of these will require the -P flag if used with grep. | |
| export REGEX_INTEGER='^[0-9]+$' | |
| export REGEX_FLOAT='^[0-9]+\.[0-9]+$' | |
| export REGEX_SEMVER="[v]?(0|[1-9][0-9]*)\\.(0|[1-9][0-9]*)\\.(0|[1-9][0-9]*)" | |
| export REGEX_URL='(https?:\/\/)?(www\.)?[-a-zA-Z0-9@:%._\+~#=]{2,256}\.[a-z]{2,6}\b([-a-zA-Z0-9@:%_\+.~#?&//=]*)' | |
| export REGEX_IP='((^\s*((([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))\s*$)|(^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$))' | |
| export REGEX_IPV4='(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])' | |
| export REGEX_IPV6='(([0-9a-fA-F]{1,4}:){7,7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:)|fe80:(:[0-9a-fA-F]{0,4}){0,4}%[0-9a-zA-Z]{1,}|::(ffff(:0{1,4}){0,1}:){0,1}((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])|([0-9a-fA-F]{1,4}:){1,4}:((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9]))' | |
| export REGEX_IPV4_IPV6='((^\s*((([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))\s*$)|(^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$))' | |
| export REGEX_IP_PRIVATE='10(?:\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}|172\.(?:1[6-9]|2[0-9]|3[01])(?:\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){2}|192\.168(?:\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){2}|127(?:\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}|169\.254(?:\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){2}' | |
| export REGEX_IPV4_PRIVATE='(10|127|169\.254|172\.1[6-9]|172\.2[0-9]|172\.3[0-1]|192\.168)\.' | |
| export REGEX_IPv6_PRIVATE='(^::1$)|(^[fF][cCdD])' | |
| export REGEX_EMAIL='[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}' | |
| export REGEX_DUPES='(\b\w+\b)(?=.*\b\1\b)' | |
| export REGEX_HEX_COLOR='#?([a-fA-F0-9]{6}|[a-fA-F0-9]{3})\b' | |
| export REGEX_DATE_ISO8601='([0-9]{4})-(0[1-9]|1[0-2])-(0[1-9]|[12][0-9]|3[01])' | |
| export REGEX_TIME_24H='([01][0-9]|2[0-3]):[0-5][0-9](:[0-5][0-9])?' | |
| export REGEX_UUID='[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}' | |
| export REGEX_MAC_ADDRESS='([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})' | |
| export REGEX_US_PHONE='\(?\b[2-9][0-9]{2}\)?[-.\s]?[2-9][0-9]{2}[-.\s]?[0-9]{4}\b' | |
| export REGEX_CREDIT_CARD='\b(?:4[0-9]{12}(?:[0-9]{3})?|5[1-5][0-9]{14}|3[47][0-9]{13}|3(?:0[0-5]|[68][0-9])[0-9]{11}|6(?:011|5[0-9]{2})[0-9]{12}|(?:2131|1800|35\d{3})\d{11})\b' | |
| export REGEX_POSTAL_CODE_US='\b\d{5}(-\d{4})?\b' | |
| export REGEX_POSTAL_CODE_CA='\b[ABCEGHJ-NPRSTVXY]\d[ABCEGHJ-NPRSTV-Z] ?\d[ABCEGHJ-NPRSTV-Z]\d\b' | |
| export REGEX_POSTAL_CODE_UK='\b([A-Z]{1,2}\d[A-Z\d]? \d[ABD-HJLNP-UW-Z]{2}|GIR 0AA)\b' | |
| export REGEX_HTML_TAG='<([a-zA-Z][a-zA-Z0-9]*)\b[^>]*>(.*?)<\/\1>' | |
| export REGEX_WHITESPACE='^[[:space:]]+$' | |
| export REGEX_PASSWORD_STRONG='^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[@$!%*?&])[A-Za-z\d@$!%*?&]{8,}$' | |
| export REGEX_BASE64='^(?:[A-Za-z0-9+/]{4})*(?:[A-Za-z0-9+/]{2}==|[A-Za-z0-9+/]{3}=)?$' | |
| export REGEX_BINARY='^[01]+$' | |
| export REGEX_HEX='^0x[0-9a-fA-F]+$' | |
| export REGEX_OCTAL='^0[0-7]+$' | |
| export REGEX_NUMERIC='^-?[0-9]+(\.[0-9]+)?$' | |
| export REGEX_ALPHANUMERIC='^[a-zA-Z0-9]+$' | |
| export REGEX_UUID_V4='[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-4[0-9a-fA-F]{3}-[89abAB][0-9a-fA-F]{3}-[0-9a-fA-F]{12}' | |
| export REGEX_YOUTUBE_URL='^(https?\:\/\/)?(www\.youtube\.com|youtu\.be)\/.+$' | |
| export REGEX_TWITTER_HANDLE='^@?(\w){1,15}$' | |
| export REGEX_GITHUB_USERNAME='^(?!-)(?!.*--)[a-zA-Z0-9-]{1,39}(?<!-)$' | |
| export REGEX_HTML_ENTITY='&[a-zA-Z]+;|&#[0-9]+;|&#x[0-9a-fA-F]+;' | |
| export REGEX_UNIX_PATH='^(\/[^\/\0]+)+\/?$' | |
| export REGEX_WINDOWS_PATH='^[a-zA-Z]:\\(?:[^\\\/\:\*\?"<>\|]+\\)*[^\\\/\:\*\?"<>\|]*$' | |
| export REGEX_ISBN_10='^(?:\d{9}X|\d{10})$' | |
| export REGEX_ISBN_13='^(?:\d{13})$' | |
| export REGEX_LATITUDE='^(-?([1-8]?[0-9](\.\d+)?|90(\.0+)?))$' | |
| export REGEX_LONGITUDE='^(-?((1[0-7][0-9](\.\d+)?|[1-9]?[0-9](\.\d+)?|180(\.0+)?)))$' | |
| export REGEX_COORDINATES='^(-?([1-8]?[0-9](\.\d+)?|90(\.0+)?)),\s*(-?((1[0-7][0-9](\.\d+)?|[1-9]?[0-9](\.\d+)?|180(\.0+)?)))$' | |
| export REGEX_HTML_COMMENT='<!--(.*?)-->' | |
| export REGEX_CSS_CLASS='\.([a-zA-Z_-][a-zA-Z0-9_-]*)\s*\{' | |
| export REGEX_CSS_ID='#([a-zA-Z_-][a-zA-Z0-9_-]*)\s*\{' | |
| export REGEX_XML_TAG='<([a-zA-Z_][\w\.-]*)(\s+[a-zA-Z_][\w\.-]*="[^"]*")*\s*(\/?)>' | |
| export REGEX_YAML_KEY='^([a-zA-Z0-9_-]+):' | |
| export REGEX_JSON_KEY='"([a-zA-Z0-9_-]+)"\s*:' | |
| export REGEX_MARKDOWN_HEADER='^(#{1,6})\s+(.+)$' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Color Helper Vars | |
| # Notes: | |
| # - Foreground assumed -- BG denotes Background. | |
| # - ANSI Code Implementation | |
| # - Codes are not positional | |
| # - Codes may be layered | |
| # - Not all codes are compatible | |
| # - 1-9 turn ON a style, 21-29 turn OFF a style | |
| # - ASCII Extras | |
| # - 10-19: Font | |
| # WARNING: | |
| # - User terminal display settings affect these. | |
| # Usage: | |
| # - echo -e "${_C_BL}${_C_YELLOW}${_C_B}${_C_BG_CYAN}HELLO${_C}" | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _C='\e[0m' # Reset | |
| _C_RESET='\e[0m' | |
| # Styles ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _C_B='\e[1m' # Bold | |
| _C_D='\e[2m' # Dim | |
| _C_I='\e[3m' # Italic | |
| _C_U='\e[4m' # Underline | |
| _C_BL='\e[5m' # Blink | |
| _C_BLS='\e[5m' # Blink Slow | |
| _C_BLF='\e[6m' # Blink Fast Support varies by OS | |
| _C_REV='\e[7m' # Reverse FG/BG Colors | |
| _C_CON='\e[8m' # Conceal Support varies by OS | |
| _C_ST='\e[9m' # Strike-through | |
| # Note: 11-20 are alternative fonts. | |
| _C_N_B='\e[21m' # NOT Bold | |
| _C_N_D='\e[22m' # NOT Dim | |
| _C_N_I='\e[23m' # NOT Italic | |
| _C_N_U='\e[24m' # NOT Underline | |
| _C_N_BL='\e[25m' # NOT Blink | |
| _C_N_BLS='\e[25m' # NOT Blink Slow | |
| _C_N_BLF='\e[26m' # NOT Blink Fast Support varies by OS | |
| _C_N_REV='\e[27m' # NOT Reverse FG/BG Colors | |
| _C_N_CON='\e[28m' # NOT Conceal Support varies by OS | |
| _C_N_ST='\e[29m' # NOT Strike-through Support varies by OS | |
| # Colors ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _C_BLACK='\e[30m' | |
| _C_RED='\e[31m' | |
| _C_GREEN='\e[32m' | |
| _C_YELLOW='\e[33m' | |
| _C_BLUE='\e[34m' | |
| _C_PURPLE='\e[35m' | |
| _C_CYAN='\e[36m' | |
| _C_WHITE='\e[37m' | |
| _C_256='\e[38;5;164m' # 38;5;<color code> | |
| _C_RGB='\e[38;2;3;252;7m' # 38;2;r;g;b # RGB support only in true-color terminals. | |
| # Background ~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _C_BG='\e[40m' | |
| _C_BG_BLACK='\e[40m' | |
| _C_BG_RED='\e[41m' | |
| _C_BG_GREEN='\e[42m' | |
| _C_BG_YELLOW='\e[43m' | |
| _C_BG_BLUE='\e[44m' | |
| _C_BG_PURPLE='\e[45m' | |
| _C_BG_CYAN='\e[46m' | |
| _C_BG_WHITE='\e[47m' | |
| _C_BG_256='\e[48;5;164m' # 48;5;<color code> | |
| _C_BG_RGB='\e[48;2;3;252;7m' # 48;2;r;g;b # RGB support only in true-color terminals. | |
| # High Intensity Foreground ~~~~~~~~~~~~ | |
| _C_HI_BLACK='\e[90m' | |
| _C_HI_RED='\e[91m' | |
| _C_HI_GREEN='\e[92m' | |
| _C_HI_YELLOW='\e[93m' | |
| _C_HI_BLUE='\e[94m' | |
| _C_HI_PURPLE='\e[95m' | |
| _C_HI_CYAN='\e[96m' | |
| _C_HI_WHITE='\e[97m' | |
| # High Intensity Background ~~~~~~~~~~~~ | |
| _C_HIBG_BLACK='\e[100m' | |
| _C_HIBG_RED='\e[101m' | |
| _C_HIBG_GREEN='\e[102m' | |
| _C_HIBG_YELLOW='\e[103m' | |
| _C_HIBG_BLUE='\e[104m' | |
| _C_HIBG_PURPLE='\e[105m' | |
| _C_HIBG_CYAN='\e[106m' | |
| _C_HIBG_WHITE='\e[107m' | |
| # Custom Combos ~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Error Codes | |
| _C_EMERGENCY='\e[01;30;41m' | |
| _C_ALERT='\e[01;31;43m' | |
| _C_CRITICAL='\e[01;97;41m' | |
| _C_ERROR='\e[01;31m' | |
| _C_WARNING='\e[01;33m' | |
| _C_NOTICE='\e[01;30;107m' | |
| _C_INFO='\e[01;39m' | |
| _C_DEBUG='\e[01;97;46m' | |
| _C_SUCCESS='\e[01;32m' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ███╗ ██╗███████╗████████╗██╗ ██╗ ██████╗ ██████╗ ██╗ ██╗ | |
| # ████╗ ██║██╔════╝╚══██╔══╝██║ ██║██╔═══██╗██╔══██╗██║ ██╔╝ | |
| # ██╔██╗ ██║█████╗ ██║ ██║ █╗ ██║██║ ██║██████╔╝█████╔╝ | |
| # ██║╚██╗██║██╔══╝ ██║ ██║███╗██║██║ ██║██╔══██╗██╔═██╗ | |
| # ██║ ╚████║███████╗ ██║ ╚███╔███╔╝╚██████╔╝██║ ██║██║ ██╗ | |
| # ╚═╝ ╚═══╝╚══════╝ ╚═╝ ╚══╝╚══╝ ╚═════╝ ╚═╝ ╚═╝╚═╝ ╚═╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # IPTables Save Helper | |
| # Notes: | |
| # - Saves current iptables/ip6tables rules to | |
| # ~/.iptables/iptables.rules.<timestamp> | |
| # and | |
| # ~/.iptables/ip6tables.rules.<timestamp> | |
| # - Timestamp is in seconds since epoch. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _iptables_save () { | |
| local DIR; | |
| DIR="$HOME/.iptabels"; | |
| sudo iptables-save | sudo tee "${DIR}/iptables.rules.$(date +%s)" > /dev/null; | |
| sudo ip6tables-save | sudo tee "${DIR}/ip6tables.rules.$(date +%s)" > /dev/null | |
| } | |
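| # Example usage (illustrative; <timestamp> is a placeholder). Restore a snapshot with iptables-restore: | |
| #   _iptables_save | |
| #   sudo iptables-restore < ~/.iptables/iptables.rules.<timestamp> | |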
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ███████╗███████╗ ██████╗██╗ ██╗██████╗ ██╗████████╗██╗ ██╗ | |
| # ██╔════╝██╔════╝██╔════╝██║ ██║██╔══██╗██║╚══██╔══╝╚██╗ ██╔╝ | |
| # ███████╗█████╗ ██║ ██║ ██║██████╔╝██║ ██║ ╚████╔╝ | |
| # ╚════██║██╔══╝ ██║ ██║ ██║██╔══██╗██║ ██║ ╚██╔╝ | |
| # ███████║███████╗╚██████╗╚██████╔╝██║ ██║██║ ██║ ██║ | |
| # ╚══════╝╚══════╝ ╚═════╝ ╚═════╝ ╚═╝ ╚═╝╚═╝ ╚═╝ ╚═╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AppArmor Profile Search Helper | |
| # Notes: | |
| # - Searches apparmor profiles for a given pattern. | |
| # - Prints the header line once followed by matching | |
| # indented lines. | |
| # - Usage: apparmor-search <pattern> | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| apparmor-search() { | |
| local PATTERN current_header header_printed | |
| PATTERN="$1" | |
| if [ -z "$PATTERN" ]; then | |
| echo "Usage: $0 <pattern>" | |
| exit 1 | |
| fi | |
| current_header="" | |
| header_printed="false" | |
| # shellcheck disable=2046 | |
| while IFS= read -r line; do | |
| # Check for header | |
| if [[ "$line" =~ ^[0-9].*(mode\.|defined\.)$ ]]; then | |
| current_header="$line" | |
| header_printed="false" | |
| continue | |
| fi | |
| # Check for an indented line that contains the pattern | |
| if [[ "$line" =~ ^[[:space:]]+ ]] && [[ "$line" == *"$PATTERN"* ]]; then | |
| if [ -n "$current_header" ]; then | |
| if [ "$header_printed" = "false" ]; then | |
| echo "$current_header" | |
| header_printed="true" | |
| fi | |
| echo "$line" | |
| fi | |
| fi | |
| done <<< $(sudo aa-status) | |
| } | |
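| # Example usage (illustrative; output depends on the profiles loaded locally): | |
| #   apparmor-search docker    # prints each aa-status header once, followed by its matching indented lines | |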
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ███████╗██╗ ██╗███╗ ██╗ ██████╗████████╗ ██╗ █████╗ ██╗ ██╗ █████╗ ███████╗ | |
| # ██╔════╝██║ ██║████╗ ██║██╔════╝╚══██╔══╝ ██║ ██╔══██╗██║ ██║██╔══██╗██╔════╝ | |
| # █████╗ ██║ ██║██╔██╗ ██║██║ ██║ ████████╗ ███████║██║ ██║███████║███████╗ | |
| # ██╔══╝ ██║ ██║██║╚██╗██║██║ ██║ ██╔═██╔═╝ ██╔══██║██║ ██║██╔══██║╚════██║ | |
| # ██║ ╚██████╔╝██║ ╚████║╚██████╗ ██║ ██████║ ██║ ██║███████╗██║██║ ██║███████║ | |
| # ╚═╝ ╚═════╝ ╚═╝ ╚═══╝ ╚═════╝ ╚═╝ ╚═════╝ ╚═╝ ╚═╝╚══════╝╚═╝╚═╝ ╚═╝╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Unicode Browser and Search | |
| # Launches an fzf-based Unicode character browser. | |
| # Dependencies: | |
| # - fzf | |
| # Notes: | |
| # - Uses Variation Selectors to display characters. | |
| # See: https://en.wikipedia.org/wiki/Variation_Selectors | |
| # - Use left/right arrow keys to change Variation Selector. | |
| # - NOTICE: Search is by hex codepoint. | |
| # Codes: | |
| # - 0028: braille pattern dots-1-2-3-4-5-6-7-8 | |
| # - FE00..FE0F: | |
| # Credit: https://stackoverflow.com/a/76256737/2676075 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| unicode_browser() { | |
| local mod tmpfile res aList | |
| showU8 () { | |
| local _i _a _f _e _t | |
| printf -v _f '%*s' 16 '' | |
| _t=${_f// /--} _f="${_f// / %b}" | |
| printf -v _t '%s %s %s\n' "${_t::6}" "${_t:1}"{,} | |
| printf -v _e '\\U%X' $(($1>16?$1+917743:$1+65023)) | |
| _e="${_f//b/b$_e}" | |
| printf 'Show UTF8 table using: VARIATION SELECTOR-%d (U+%X)\n' \ | |
| "$1" $(($1>16?$1+917743:$1+65023)) | |
| shift | |
| for _a; do | |
| printf "U%03Xyx $_f $_e\n%s" 0x"${_a}" {,}{{0..9},{A..F}} "$_t" | |
| for _i in {0..9} {A..F}; do | |
| (( 16#$_a == 0 )) && (( ( 16#$_i & 7 ) < 2 )) && | |
| printf 'U%04Xx%68s\n' 0x"$_a$_i" '' && continue | |
| printf "U%04Xx $_f $_e\n" 0x"$_a$_i" \ | |
| "\\U$_a$_i"{,}{{0..9},{A..F}} | |
| done | |
| done | |
| } | |
| tmpfile=$(mktemp) ; echo "${mod:-16}" >"$tmpfile" | |
| aList=( {0..215} {249..282} {284..293} {303..308} {324..326} {360..363} \ | |
| {366..367} {392..396} {431..434} {444..444} {463..474} {479..482} {487..489}\ | |
| {492..494} {496..507} {512..747} {760..762} {768..787} {3584..3585} ) | |
| res="$( | |
| fzf --no-mouse -m < <(printf '%04X\n' "${aList[@]}") --preview \ | |
| "$(declare -f showU8);showU8 \$(<$tmpfile) {}" --preview-window 74 \ | |
| --bind "left:execute| mod=\$(<$tmpfile) ; echo > $tmpfile \$(( \ | |
| mod > 1 ? mod - 1 : 256 ))|+refresh-preview" \ | |
| --bind "right:execute| echo > $tmpfile \$(( mod=\$(<$tmpfile), \ | |
| mod < 256 ? mod + 1 : 1 ))|+refresh-preview" )" | |
| showU8 "$(<"$tmpfile")" "$res" | |
| rm "$tmpfile" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Unicode Search | |
| # Searches Unicode characters by name. | |
| # Arguments: | |
| # - $1 Search term (case insensitive) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| unicode_search() { | |
| python3 -c $'from unicodedata import name | |
| for i in range(0x10FFFF): | |
| try: | |
| var = name(chr(i)) | |
| except: | |
| var = None | |
| finally: | |
| if var: | |
| print("\\\\U%06X: \47%s\47 %s" % (i,chr(i),var))' | \ | |
| grep -i --color=auto "$1" | |
| } | |
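| # Example usage (illustrative; scanning every codepoint may take several seconds): | |
| #   unicode_search "chess"    # BLACK CHESS KING, WHITE CHESS QUEEN, ... with their codepoints | |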
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Working Indicator | |
| # Displays an animated working indicator in the terminal. | |
| # Arguments: | |
| # - $@ Array of symbols to use for animation. | |
| # If none provided, defaults to a set of | |
| # five symbols. | |
| # - spinner If 'spinner' is provided as an argument, | |
| # the symbols will spin in a circular | |
| # manner. | |
| # Usage: | |
| # __working ⚪ 🟡 🟠 🟢 🟤 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __working() { | |
| local args="$*" | |
| local symbols=( ${args} ) | |
| [[ "$args" =~ "spinner" ]] && local spin='true' | |
| # shellcheck disable=2206 | |
| [[ "$args" =~ "spinner" ]] && local symbols=( ${args//spinner/} ) | |
| [[ $# -le 1 ]] && declare -a symbols=("𝍠" "𝍡" "𝍢" "𝍣" "𝍤") | |
| local direction='up' | |
| while true; do | |
| for ((s=0; s<${#symbols[@]}; s++)); do | |
| printf "\r%s Working..." "${symbols[$s]}" | |
| sleep 0.2 | |
| if [[ ! "$spin" == "true" ]]; then | |
| [[ "$s" -eq $((${#symbols[@]} - 1)) && "$direction" == "up" ]] && direction='down' | |
| [[ "$s" -eq 0 && "$direction" == "down" ]] && direction='up' | |
| [[ "$direction" == "down" ]] && s=$((s - 2)) | |
| fi | |
| done | |
| done | |
| printf "\r%s Done! \n" "✅" | |
| } | |
| # Aliases for common working indicators | |
| alias __working_suits='__working ♣️ ♦️ ♠️ ♥️ spinner' | |
| alias __working_colors='__working ⚪ 🟡 🟠 🟢 🟤 🔴 🔵 🟣 ⚫' | |
| alias __working_colors_square='__working ⬜ 🟨 🟧 🟩 🟫 🟥 🟦 🟪 ⬛' | |
| alias __working_moon_phases='__working 🌑 🌒 🌓 🌔 🌕 🌖 🌗 🌘 spinner' | |
| alias __working_dots='__working ⣾ ⣽ ⣻ ⢿ ⡿ ⣟ ⣯ ⣷ spinner' | |
| alias __working_clocks='__working 🕐 🕑 🕒 🕓 🕔 🕕 🕖 🕗 🕘 🕙 🕚 🕛 spinner' | |
| alias __working_breath='__working 🞅 🞆 🞇 🞈 🞉' | |
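| # Typical pattern (illustrative sketch; '_work_pid' is an arbitrary name): run an indicator in the background, | |
| # then kill it when the real task finishes. | |
| #   __working_dots & _work_pid=$! | |
| #   sleep 5                          # stand-in for a long-running task | |
| #   kill "$_work_pid"; printf '\r✅ Done!      \n' | |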
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Helper: Context Helper | |
| # Identifies current operating context. | |
| # Outputs: | |
| # Prints one of the following: | |
| # - terminal | |
| # - ssh | |
| # - sudo | |
| # - script | |
| # - function | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __context() { | |
| local results=() | |
| [[ -n "$SSH_CLIENT" || -n "$SSH_TTY" ]] && results+=("ssh") | |
| [[ "$SUDO_USER" && "$SUDO_USER" != "$USER" ]] && results+=("sudo") | |
| [[ -n "$PS1" ]] && results+=("terminal") | |
| [[ "${BASH_SOURCE[1]}" == "$0" ]] && results+=("script") | |
| [[ -n "${FUNCNAME[1]}" ]] && results+=("function") | |
| echo "${results[@]}" | tr ' ' ',' | |
| } | |
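| # Example output (illustrative; the result depends on how and where the call is made): | |
| #   __context    # e.g. includes 'terminal' locally and adds 'ssh' over SSH | |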
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Helper: Data Type | |
| # Arguments: | |
| # $1 Variable Name | |
| # Outputs: | |
| # Prints the data type of the variable. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __data_type() { | |
| [[ -z "$1" ]] && { loggerx ERROR "Data is required." >&2; return 1; } | |
| local data="$*" | |
| local result="string" | |
| grep -qE "$REGEX_INTEGER"<<<"$data" && result="integer" | |
| grep -qE "$REGEX_FLOAT"<<<"$data" && result="float" | |
| grep -qE "$REGEX_SEMVER"<<<"$data" && result="semver" | |
| grep -qE "$REGEX_URL"<<<"$data" && result="url" | |
| grep -qE "$REGEX_IPV4"<<<"$data" && result="ipv4" | |
| grep -qE "$REGEX_IPV6"<<<"$data" && result="ipv6" | |
| grep -qE "$REGEX_IP_PRIVATE"<<<"$data" && result="ip_private" | |
| grep -qE "$REGEX_EMAIL"<<<"$data" && result="email" | |
| echo "$result" | |
| } | |
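| # Example usage (illustrative): | |
| #   __data_type 42                  # integer | |
| #   __data_type 3.14                # float | |
| #   __data_type user@example.com    # email | |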
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Helper: Variable Type | |
| # Arguments: | |
| # $1 Variable Name | |
| # Outputs: | |
| # Prints the type of the variable. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __variable_type() { | |
| [[ -z "$1" ]] && { loggerx ERROR "Variable name is required." >&2; return 1; } | |
| local var_name="$1" | |
| local result | |
| local var_type | |
| result=$(declare -p "$var_name" 2>&1) | |
| case "$result" in | |
| declare\ -a*) var_type="array" ;; | |
| declare\ -A*) var_type="associative array" ;; | |
| declare\ -i*) var_type="integer" ;; | |
| declare\ -r*) var_type="readonly" ;; | |
| declare\ -x*) var_type="exported" ;; | |
| declare\ --*) var_type="string" ;; | |
| *not\ found*) var_type="undefined" ;; | |
| *) var_type="unknown" ;; | |
| esac | |
| echo "$var_type" | |
| } | |
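| # Example (illustrative variable names): pass the variable NAME, not its value. | |
| #   declare -a MY_LIST=(a b c); __variable_type MY_LIST   # array | |
| #   MY_STR='hi';                __variable_type MY_STR    # string | |
| #   __variable_type NO_SUCH_VAR                           # undefined | |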
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Sanity: Is SUDO | |
| # Notes: | |
| # - Exits script if not run with sudo. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __is_sudo() { | |
| local term_cmd | |
| __context | grep -qE 'terminal' && term_cmd='return 1' || term_cmd='exit 1' | |
| [[ "$EUID" -ne 0 && "$1" == "KILL" ]] && { loggerx CRITICAL "This script MUST be run with sudo. Exiting."; $term_cmd; } | |
| [[ "$EUID" -ne 0 ]] && { loggerx WARNING "This script should be run with sudo."; } | |
| } | |
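| # Example (illustrative): call near the top of a privileged script or function. | |
| #   __is_sudo KILL   # CRITICAL + exit/return 1 when not running as root | |
| #   __is_sudo        # WARNING only; execution continues | |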
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # DX friendly user prompts. | |
| # Arguments: | |
| # $1 Quoted prompt. IE: "Proceed?" | |
| # $2 Default Option. IE: Y | |
| # Outputs: | |
| # Exit Code matching response. | |
| # Examples: | |
| # - ask "Proceed?" Y | |
| # - while ask "Continue? " Y; do echo fail ; done | |
| # - if ask "$(echo -e 'Whazzzzzzup?'\\\n'1) woot'\\\n'2) rawr'\\\n'3) grrr'\\\n'0) quit'\\\n\\\n'Enter Response')" Range 0-3; then | |
| # answer="$reply" | |
| # else | |
| # answer="no" | |
| # fi | |
| # echo $answer | |
| # - if ask "Confirm: Stuff?" N; then | |
| # answer="yes" | |
| # else | |
| # answer="no" | |
| # fi | |
| # echo $answer | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| ask() { | |
| while true; do | |
| if [ "${2:-}" = "Y" ]; then | |
| prompt="Y/n" | |
| default=Y | |
| elif [ "${2:-}" = "N" ]; then | |
| prompt="y/N" | |
| default=N | |
| elif [ "${2:-}" = "Range" ]; then | |
| prompt="${3:-}" | |
| default=0 | |
| else | |
| prompt="y/n" | |
| default= | |
| fi | |
| read -rp $"$1 [$prompt]: " reply | |
| if [ -z "$reply" ]; then | |
| reply=$default | |
| fi | |
| case "$reply" in | |
|       Y*|y*|[1-9]*) return 0 ;; | |
| N*|n*|0*) return 1 ;; | |
| esac | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # DEMO: Execute sudo with an rc file. | |
| # Try: __sudo_say_hi | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __sudo_user_do() { | |
| command sudo /bin/bash --rcfile /usr/local/share/sudo/bashrc -ci "$*" | |
| } | |
| # /usr/local/share/sudo/bashrc | |
| # __sudo_say_hi() { | |
| # printf "%s\n" "Hi." \ | |
| # "USER : $USER" \ | |
| # "SUDO_USER: $SUDO_USER" \ | |
| # "whoami: f$(whoami)" | |
| # } | |
| # | |
| # __sudo_user_trace() { | |
| # echo "${SUDO_USER:-$(pstree -Alsu "$1" | sed -n "s/.*(\([^)]*\)).*($USER)[^(]*$/\1/p")}" | |
| # } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report the name of the OS distribution. | |
| # Outputs: | |
| # The NAME field from the /etc/*-release file | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_os_distro() { | |
| DISTRO=$(awk -F= '$1=="NAME" { gsub(/"/,"",$2); print $2 }' /etc/*-release) | |
| export DISTRO | |
| echo "$DISTRO" | |
| } | |
| _check_os_distro >/dev/null 2>&1 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report the architecture of the system. | |
| # Outputs: | |
| # Print the system architecture as defined by `uname -m` | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_os_arch() { | |
| case $(uname -m) in | |
| arm64|aarch64) ARCH="ARM64" ;; | |
| armhf|armv7*) ARCH="ARM32_COMPAT" ;; | |
| armv8*) ARCH="ARM64_COMPAT" ;; | |
| i*86*) ARCH="x86" ;; | |
| amd64|x86_64*) ARCH="x64" ;; | |
|     *) ARCH="unknown: $(uname -m)" ;; | |
| esac | |
| export ARCH | |
| echo "$ARCH" | |
| } | |
| _check_os_arch >/dev/null 2>&1 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report the OS Type | |
| # Outputs: | |
| # The type of system as defined by the $OSTYPE variable. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_os_type() { | |
| case "$OSTYPE" in | |
| solaris*) OS_TYPE="SOLARIS" ;; | |
| darwin*) OS_TYPE="OSX" ;; | |
| linux*) OS_TYPE="LINUX" ;; | |
| bsd*) OS_TYPE="BSD" ;; | |
| msys*) OS_TYPE="WINDOWS" ;; | |
| cygwin*) OS_TYPE="CYGWIN" ;; | |
| *) OS_TYPE="unknown: $OSTYPE" ;; | |
| esac; | |
| export OS_TYPE | |
| echo "$OS_TYPE" | |
| } | |
| _check_os_type >/dev/null 2>&1 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report system information from /etc/os-release | |
| # Arguments & Outputs: | |
| #   - See: https://www.freedesktop.org/software/systemd/man/os-release.html | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_os_info() { | |
| key="$*" | |
| function help { | |
| printf '%s\n' 'Try: NAME, PRETTY_NAME, VERSION_ID, VERSION, ID_LIKE' | |
| printf '%s\n' 'REF: https://www.freedesktop.org/software/systemd/man/os-release.html' | |
| return 0 | |
| } | |
|   [[ $key == '-h' || $# -eq 0 ]] && { help; return 0; } | |
| for key in "$@"; do | |
| sed -ne "s/^$key=//p" /etc/os-release | tr -d '"' | |
| done | |
| } | |
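| # Example (illustrative; values come from /etc/os-release): | |
| #   _check_os_info PRETTY_NAME VERSION_ID   # prints one value per line | |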
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Wireless Interface Name | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_wireless_interface_name() { | |
| if command -v nmcli >/dev/null 2>&1; then | |
| nmcli -t -f DEVICE,TYPE device status 2>/dev/null \ | |
| | awk -F: '$2 == "wifi" { print $1; exit }' | |
| fi | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report CPU Information | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_cpu() { | |
| local LSCPU | |
| LSCPU=$(lscpu) | |
| printf "%s\n" "$(grep "Model name" <<< "$LSCPU" | \ | |
| cut -d: -f2- | \ | |
| sed -E 's/^[[:space:]]?+//') \ | |
| ($(( \ | |
| $(grep "Core(s)" <<< "$LSCPU" | \ | |
| cut -d: -f2) \ | |
| * \ | |
| $(grep "Thread(s)" <<< "$LSCPU" | \ | |
| cut -d: -f2) \ | |
| )) \ | |
| Threads @ ~$(grep "max MHz" <<< "$LSCPU" | \ | |
| cut -d: -f2 | \ | |
| sed -E 's/^[[:space:]]?+//' | \ | |
| cut -d. -f1) \ | |
| MHz max)" | sed 's/ */ /g' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Memory Information | |
| # Note: Pre-formatting for other use. `free -m` is generally | |
| # fine. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_memory() { | |
| local MEMINFO TOTAL_MEM FREE_MEM USED_MEM BUFFERS_CACHED | |
| MEMINFO=$(grep -E '^MemTotal:|^MemFree:|^Buffers:|^Cached:' /proc/meminfo) | |
| TOTAL_MEM=$(grep 'MemTotal:' <<< "$MEMINFO" | awk '{print $2}') | |
| FREE_MEM=$(grep 'MemFree:' <<< "$MEMINFO" | awk '{print $2}') | |
| BUFFERS_CACHED=$(( $(grep 'Buffers:' <<< "$MEMINFO" | awk '{print $2}') + \ | |
| $(grep 'Cached:' <<< "$MEMINFO" | awk '{print $2}') )) | |
| USED_MEM=$(( TOTAL_MEM - FREE_MEM - BUFFERS_CACHED )) | |
| printf "Total: %d MB\n" $(( TOTAL_MEM / 1024 )) | |
| printf "Used : %d MB\n" $(( USED_MEM / 1024 )) | |
| printf "Free : %d MB\n" $(( FREE_MEM / 1024 )) | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report total system memory. | |
| # Outputs: | |
| # Total system memory in human readable format. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_memory_installed() { | |
| local TOTAL_MEM | |
|   TOTAL_MEM=$(lsmem -b --summary=only | sed -ne '/online/s/.* //p' | awk '{print $1}') | |
| # shellcheck disable=2119 | |
| echo "$TOTAL_MEM" | bytesToHumanReadable | |
| } | |
| # Reminder: | |
| # - IN GB without bytesToHuman...: | |
| #     lsmem -b --summary=only | sed -ne '/online/s/.* //p' | awk '{print $1 / 1024 / 1024 / 1024}' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Total Installed Memory in MB | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_memory_installed_mb() { | |
|   local INSTALLED_MEM UNIT | |
| UNIT=$(sudo dmidecode -t memory | grep Size: | grep -v "No Module Installed" | awk 'NR==1 {print $NF; exit}') | |
| INSTALLED_MEM=$(sudo dmidecode -t memory | grep 'Size:' | grep -v 'No Module Installed' | awk '{sum += $2} END {print sum}') | |
| printf "%s\n" "$INSTALLED_MEM $UNIT" | \ | |
| awk '{ | |
| if ($2 == "GB") { | |
| printf "%d\n", $1 * 1024 | |
| } else if ($2 == "MB") { | |
| printf "%d\n", $1 | |
| } else { | |
| printf "0\n" | |
| } | |
| }' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Total Installed Memory in GB | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_memory_installed_gb() { | |
|   local INSTALLED_MEM UNIT | |
| UNIT=$(sudo dmidecode -t memory | grep Size: | grep -v "No Module Installed" | awk 'NR==1 {print $NF; exit}') | |
| INSTALLED_MEM=$(sudo dmidecode -t memory | grep 'Size:' | grep -v 'No Module Installed' | awk '{sum += $2} END {print sum}') | |
| printf "%s\n" "$INSTALLED_MEM $UNIT" | \ | |
| awk '{ | |
| if ($2 == "GB") { | |
| printf "%d\n", $1 | |
| } else if ($2 == "MB") { | |
| printf "%d\n", int($1 / 1024) | |
| } else { | |
| printf "0\n" | |
| } | |
| }' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Total Available Memory in MB | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_memory_total_available_mb() { | |
| local TOTAL_MEM | |
| TOTAL_MEM=$(grep 'MemTotal:' /proc/meminfo | awk '{print $2}') | |
| printf "%d\n" $(( TOTAL_MEM / 1024 )) | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Total Available Memory in GB | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_memory_total_available_gb() { | |
| local TOTAL_MEM | |
| TOTAL_MEM=$(grep 'MemTotal:' /proc/meminfo | awk '{print $2}') | |
| printf "%d\n" $(( TOTAL_MEM / 1024 / 1024 )) | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report GPU Type | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_gpu() { | |
| lspci -nn | grep 'VGA' | cut -d' ' -f6- | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report GPU Driver Details | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_gpu_driver() { | |
| local GPU | |
| GPU=$(escape_string <<<"$(_check_gpu)") | |
| lspci -nnk | grep -iE "$GPU" -A3 | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Generate System Report | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_system_report() { | |
| printf "%s\n" "===== SYSTEM REPORT =====" | |
| printf "%s %s\n" "OS Distro :" "$(_check_os_distro)" | |
| printf "%s %s\n" "OS Type :" "$(_check_os_type)" | |
| printf "%s %s\n" "Architecture :" "$(_check_os_arch)" | |
| printf "%s %s\n" "CPU :" "$(_check_cpu)" | |
| printf "%s %s\n" "Memory (GB) :" "$(_check_memory_installed_gb)" | |
| printf "%s %s\n" "GPU :" "$(_check_gpu)" | |
| printf "%s %s\n" "GPU Driver :" "$(_check_gpu_driver | grep 'Kernel driver in use' | cut -d: -f2 | sed -E 's/^[[:space:]]+//')" | |
| printf "%s\n" "=========================" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Report Longest IANA TLDs | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _check_longest_iana_urls() { | |
| awk '{print length($0), $0; }'<<<"$(curl -sS https://data.iana.org/TLD/tlds-alpha-by-domain.txt)" | \ | |
| grep -v 'XN-' | \ | |
| sort -r -n | \ | |
| head | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # True Color Demo | |
| # Cyclomatic Complexity: 10 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=SC2183 | |
| color_demo_truecolor() { | |
| # shellcheck disable=SC2183 | |
| local TXT FG COLUMN COLUMNS FILL_L FILL_R R G B | |
| TXT="This will be a smooth gradient if truecolor is supported." | |
|   COLUMNS="${COLUMNS:-$(tput cols || echo 120)}" | |
| FILL_L="$(printf '%*s' "$(((COLUMNS - ${#TXT}) / 2))")" | |
| FILL_R="$(printf '%*s' "$(((COLUMNS - ${#TXT}) / 2))")" | |
| [[ $((${#TXT}%2)) -eq 1 ]] && FILL_R="$(printf '%*s' "$((((COLUMNS - ${#TXT}) / 2) +1 ))")" | |
| FG=$(printf '%s' "$FILL_L"; printf '%s' "$TXT"; printf '%s' "$FILL_R") | |
| for ((COLUMN=0; COLUMN<COLUMNS; COLUMN++)); do | |
| # Iterate RGB values ; Ensure int stays within range 0..255 | |
| ((R=255-(COLUMN*255/COLUMNS))); ((R<0))&&((R=255-(R+255))); ((R>255))&&((R=R-(R-255))) | |
| ((G=COLUMN*510/COLUMNS)) ; ((G<0))&&((G=255-(G+255))); ((G>255))&&((G=G-(G-255))) | |
| ((B=COLUMN*255/COLUMNS)) ; ((B<0))&&((B=255-(B+255))); ((B>255))&&((B=B-(B-255))) | |
| printf "\e[48;2;%d;%d;%dm" $R $G $B # BG | |
| printf "\e[38;2;%d;%d;%d;1m" $R 0 $B # FG Color | |
| printf "%s\e[0m" "${FG:${COLUMN}:1}" # FG Content | |
| #printf "%s\e[0m" " " # No FG Content | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Color Demo | |
| # Iterates a range of 8-bit colors for demo purposes. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __color_demo() { | |
| [[ $1 -gt 256 ]] && loggerx WARNING "Max colors: 256" | |
| for i in $(seq 1 "$1"); do | |
| printf '\e[48;5;%sm %03d \e[0m' "$i" "$i" | |
| ! (( i % 8 )) && printf '\n' | |
| done | |
| } | |
| alias color_demo_8_bit='__color_demo 8' | |
| alias color_demo_256_bit='__color_demo 256' | |
| color_ansi_demo() { | |
| local STYLES=( | |
| "0:None" "1:Bold" "2:Dim" "3:Italic" "4:Underline" | |
| "5:Blink" "7:Inverse" "8:Hidden" "9:Strike" | |
| ) | |
| local FGS=( | |
| "30:Black" "31:Red" "32:Green" "33:Yellow" | |
| "34:Blue" "35:Magenta" "36:Cyan" "37:White" | |
| "90:Bright Black" "91:Bright Red" "92:Bright Green" "93:Bright Yellow" | |
| "94:Bright Blue" "95:Bright Magenta" "96:Bright Cyan" "97:Bright White" | |
| ) | |
| local BGS=( | |
| "40:Black" "41:Red" "42:Green" "43:Yellow" | |
| "44:Blue" "45:Magenta" "46:Cyan" "47:White" | |
| "100:Bright Black" "101:Bright Red" "102:Bright Green" "103:Bright Yellow" | |
| "104:Bright Blue" "105:Bright Magenta" "106:Bright Cyan" "107:Bright White" | |
| ) | |
| printf "\nANSI Attribute / Foreground / Background Demo\n" | |
| local STYLE FG BG STYLE_CODE STYLE_NAME FG_CODE FG_NAME BG_CODE | |
| for STYLE in "${STYLES[@]}"; do | |
| IFS=: read -r STYLE_CODE STYLE_NAME <<< "$STYLE" | |
| printf "\n[%s] %s\n" "$STYLE_CODE" "$STYLE_NAME" | |
| for FG in "${FGS[@]}"; do | |
| IFS=: read -r FG_CODE FG_NAME <<< "$FG" | |
| printf " FG %3s %-14s" "$FG_CODE" "$FG_NAME" | |
| for BG in "${BGS[@]}"; do | |
| IFS=: read -r BG_CODE <<< "$BG" | |
| printf "\e[%s;%s;%sm %s \e[0m" "$STYLE_CODE" "$FG_CODE" "$BG_CODE" "asdf" | |
| done | |
| printf "\e[0m\n" | |
| done | |
| done | |
| printf "\n" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Timer: Start, Stop, Duration | |
| # Usage: | |
| # timer-start # Start timer | |
| # timer-duration # Get duration in seconds | |
| # timer-duration 8601 # Get duration in iso-8601-like format | |
| # timer-stop # Stop timer and get duration in seconds | |
| # timer-stop 8601 # Stop timer and get duration in | |
| # iso-8601-like format | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| timer-handler() { | |
| if [[ -n $timer_start ]]; then | |
| timer_now_s=$SECONDS | |
| timer_duration_s=$(( timer_now_s - timer_start )) | |
| else | |
| echo "timer not started" | |
| return 1 | |
| fi | |
| timer_output="$timer_duration_s" | |
| # days:hours:minutes:seconds | |
| if [[ "$1" == "8601" ]]; then | |
| [[ $(( timer_duration_s/3600/24 )) -eq 0 ]] && timer_output=$(date --utc -d "@$timer_duration_s" +"%H:%M:%S") | |
| [[ $(( timer_duration_s/3600/24 )) -gt 0 ]] && timer_output=$(date --utc -d "@$timer_duration_s" +"$(( timer_duration_s/3600/24 )):%H:%M:%S") | |
| fi | |
| } | |
| timer-start() { | |
| if [[ -n ${timer_start+x} ]]; then | |
| loggerx ERROR "Timer already started." | |
| return 1 | |
| fi | |
| timer_start=$SECONDS | |
| } | |
| timer-duration() { | |
| timer-handler "$1" | |
| echo "$timer_output" | |
| } | |
| timer-stop() { | |
| if [[ "$1" == "-h" ]]; then | |
| echo "Returns duration in seconds. Specify '8601' for an iso-8601-like output." | |
| return 1 | |
| fi | |
| timer-duration "$1" | |
| unset timer_start | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Continue in... (countdown helper) | |
| # Usage: | |
| #   __continue_in 5   # "Continuing in 5..." down to 1, one second per step | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __continue_in() { | |
|   local count i | |
|   count=$1 | |
|   for i in $(seq 1 "$count"); do | |
| printf "\r%s" "Continuing in $(( (count+1) - i ))..." | |
| sleep 1 | |
| done | |
| echo '' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Detect processes that are maintaining many open files | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| detect_file_hogs() { | |
| local tmp line open_files pid proc_name proc_hard_limit output_lines | |
| output_lines=10 | |
| [[ $1 =~ ^[0-9]+$ ]] && output_lines=$1 | |
| tmp=$(mktemp) | |
| echo PID OPEN_FILES PROC_HARD_LIMIT PROC-NAME >> "$tmp" | |
| while read -r line; do | |
| open_files=$(echo "$line" | cut -f1 -d' ') | |
| pid=$(echo "$line" | cut -f2 -d' ') | |
| proc_name=$(ps -p "$pid" -o comm= 2>/dev/null) | |
| proc_hard_limit=$(awk '/files/ {print $5; exit}' "/proc/$pid/limits" 2>/dev/null || echo -n "") | |
| echo "$pid $open_files $proc_hard_limit $proc_name" >> "$tmp" | |
| done <<< "$(lsof 2>/dev/null | awk '{print $2}' | sort | uniq -c | sort -rn | head -n "$output_lines")" | |
| column -t "$tmp" | |
| rm -f "$tmp" | |
| } | |
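| # Example (illustrative): report the top 20 PIDs by open file count (default is 10). | |
| #   detect_file_hogs 20 | |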
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Daily Notes | |
| # Usage: | |
| # Add entry: | |
| # dnote Did some thing | |
| #   Show today's notes | |
| #     dnote | |
| #   Show yesterday's notes | |
| #     dnote -y | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| daily_notes() { | |
|   local DAILY_NOTES_FILE DATE | |
|   DAILY_NOTES_FILE="$HOME/DailyNotes.md" | |
|   DATE=$(date --utc +'%Y-%m-%d') | |
|   [[ ! -f "$DAILY_NOTES_FILE" ]] && printf '%s\n' '# Daily Notes' > "$DAILY_NOTES_FILE" | |
|   if [[ $# -eq 0 ]]; then | |
|     if [[ $(awk "/## ${DATE}/,/^$/" "$DAILY_NOTES_FILE" | wc -l) -eq 0 ]]; then | |
|       printf '%s\n' "No entries yet for today. Here are yesterday's notes." | |
|       awk "/## $(date --utc -d "yesterday" +'%Y-%m-%d')/,/^$/" "$DAILY_NOTES_FILE" | |
|       return | |
|     else | |
|       awk "/## ${DATE}/,/^$/" "$DAILY_NOTES_FILE" | |
|       return | |
|     fi | |
|   fi | |
|   if [[ $1 == '-y' ]]; then | |
|     awk "/## $(date --utc -d "yesterday" +'%Y-%m-%d')/,/^$/" "$DAILY_NOTES_FILE" | |
|     return | |
|   fi | |
|   if ! grep -q "$DATE"<<<"$(head -n 20 "$DAILY_NOTES_FILE")"; then | |
|     sed -i "1 a \#\# $DATE\n" "$DAILY_NOTES_FILE" | |
|   fi | |
|   sed -i -e "0,/^$/ s/^$/- $*\n/" "$DAILY_NOTES_FILE" | |
| } | |
| alias dnote='daily_notes' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # JWT Tooling: Create JWT Key/Cert | |
| # IE: For PHP-JWT | |
| # Outputs: | |
| # - RSA 4096 keypair | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| jwt_create_keycert(){ | |
| read -rp "Output Key File [id.key]: " output_key; output_key=${output_key:-id.key} | |
|   read -rp "Output Cert File [id.cert]: " output_cert; output_cert=${output_cert:-id.cert} | |
| openssl req \ | |
| -new \ | |
| -newkey rsa:4096 \ | |
| -days 3650 \ | |
| -nodes \ | |
| -x509 \ | |
| -keyout "$output_key" \ | |
| -out "$output_cert" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # JWT Tooling: Create JWT Token | |
| # Arguments: | |
| # - Expiry offset in seconds REQUIRED | |
| # - Subject Key REQUIRED | |
| # - Subject Value REQUIRED | |
| # - Signature Signal OPTIONAL | |
| # Outputs: | |
| # - Multi-segment base64 encoded JWT token | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| jwt_create_token() { | |
| local HEADER ISSUE_TS EXP PAY SIG JWT SECRET | |
| function help { | |
| printf '%s\n' 'Requires at least 3 arguments.' | |
|     printf '%s\n' 'Usage: jwt_create_token <expiry offset in seconds> <subject key> <subject val>' | |
|     printf '%s\n' '  EG: jwt_create_token 8600 user_id the_king' | |
| printf '%s\n' "Supply 'SIGN' as the fourth argument to add a signature" | |
| } | |
|   if [[ "$1" == '-h' ]] || [[ $# -lt 3 ]]; then | |
| help | |
| return | |
| fi | |
| HEADER=$(echo -n '{"alg":"HS256","typ":"JWT"}' | openssl base64 -e -A | tr '+/' '-_' | tr -d '=') | |
| ISSUE_TS=$(date +%s) | |
|   EXP=$((ISSUE_TS + $1)) | |
| PAY=$(echo -n "{\"$2\":\"$3\",\"exp\":\"$EXP\"}" | openssl base64 -e -A | tr '+/' '-_' | tr -d '=') | |
| if [[ "$4" == "SIGN" ]]; then | |
| read -srp "Signature Secret: " SECRET; echo '' | |
| SIG=$(echo -n "$HEADER.$PAY" | openssl dgst -sha256 -hmac "$SECRET" -binary | openssl base64 -e -A | tr '+/' '-_' | tr -d '=') | |
| JWT="$HEADER.$PAY.$SIG" | |
| echo "$JWT" | |
| return | |
| fi | |
| JWT="$HEADER.$PAY" | |
| echo "$JWT" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # JWT Tooling: Decode base64 JWT Token | |
| # Notes: | |
| # - Correct length with an amount of padding. | |
| # Arguments: | |
| # - JWT Token (single segment) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| jwt_decode_base64() { | |
| local len=$((${#1} % 4)) | |
| local result="$1" | |
| if [ $len -eq 2 ]; then | |
| result="$1"'==' | |
| elif [ $len -eq 3 ]; then | |
| result="$1"'=' | |
| fi | |
| echo "$result" | tr '_-' '/+' | openssl enc -d -base64 | |
| echo '' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # JWT Tooling: Decode Segment of a multi-segment JWT token | |
| # Notes: | |
| # - This is used by the jwt_header, jwt_payload, and | |
| # jwt_signature helpers below. | |
| # Arguments: | |
| # - Segment to decode | |
| # - Multi-segment JWT token | |
| # Cyclomatic Complexity: 7 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| jwt_decode_token() { | |
| local SECRET SIG | |
| [[ $( (tr -dc . | wc -c)<<<"$*" ) -eq 0 ]] && jwt_decode_base64 "$2" | |
| [[ $( (tr -dc . | wc -c)<<<"$*" ) -ge 1 ]] && { [[ "$1" -eq 1 ]] || [[ "$1" -eq 2 ]] ;} && jwt_decode_base64 "$(echo -n "$2" | cut -d "." -f "$1")" | jq . | |
| if [[ $( (tr -dc . | wc -c)<<<"$*" ) -eq 2 ]] && [[ "$1" -eq 3 ]]; then | |
| read -srp "Signature Secret: " SECRET; echo '' | |
| SIG=$( | |
| echo -n "$( (cut -d"." -f1)<<<"$2" ).$( (cut -d"." -f2)<<<"$2" )" \ | |
| | openssl dgst -sha256 -hmac "$SECRET" -binary \ | |
| | openssl base64 -e -A \ | |
| | tr '+/' '-_' \ | |
| | tr -d '=') | |
| [[ "$SIG" == "$( (cut -d"." -f3)<<<"$2" )" ]] && loggerx SUCCESS "Signature OK" | |
| fi | |
| } | |
| alias jwt_header="jwt_decode_token 1" # Decode JWT header | |
| alias jwt_payload="jwt_decode_token 2" # Decode JWT Payload | |
| alias jwt_signature="jwt_decode_token 3" # Verify JWT Signature | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # JWT Tooling: Automatically handle whichever length of | |
| # token may be provided. | |
| # Arguments: | |
| # - Segment(s) to decode | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| jwt_decoder() { | |
| [[ $( (tr -dc . | wc -c)<<<"$*" ) -eq 0 ]] && jwt_decode_token 1 "$*" | |
| if [[ $( (tr -dc . | wc -c)<<<"$*" ) -eq 1 ]]; then | |
| jwt_decode_token 1 "$*" | |
| jwt_decode_token 2 "$*" | |
| fi | |
| if [[ $( (tr -dc . | wc -c)<<<"$*" ) -eq 2 ]]; then | |
| jwt_decode_token 1 "$*" | |
| jwt_decode_token 2 "$*" | |
| jwt_decode_token 3 "$*" | |
| fi | |
| } | |
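| # Example (illustrative): round-trip using the helpers above. An unsigned token has one | |
| # dot (header.payload); supply SIGN to jwt_create_token for a third, verifiable segment. | |
| #   token=$(jwt_create_token 3600 user_id the_king) | |
| #   jwt_header "$token"    # pretty-prints {"alg":"HS256","typ":"JWT"} | |
| #   jwt_payload "$token"   # pretty-prints {"user_id":"the_king","exp":"..."} | |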
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Misc : System | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias __system_update='source <(curl -s https://gist.githubusercontent.com/AlexAtkinson/27b12f4dfda31b1b74fcab3fc9a6d192/raw/init.sh)' | |
| alias __update_system='__system_update' # Alias for __system_update | |
| alias __sysctl_update='sudo sysctl --system' # Apply _all_ sysctl config changes | |
| alias __reload_daemons='sudo systemctl daemon-reload' # Reload systemd manager configuration | |
| alias __list_services='systemctl list-units --type=service --all' # List all services | |
| alias __list_active_services='systemctl list-units --type=service --state=active' # List active services | |
| alias __list_failed_services='systemctl --failed --type=service' # List failed services | |
| alias __journalctl_follow='sudo journalctl -f --no-pager' # Follow journalctl logs | |
| alias __journalctl_boot='sudo journalctl -b --no-pager' # Show journalctl logs for current boot | |
| alias __journalctl_boot_all='sudo journalctl -b -1 --no-pager' # Show journalctl logs for previous boot | |
| alias __list_timers='systemctl list-timers --all' # List all systemd timers | |
| alias __check_disk_health='sudo smartctl -a /dev/sda' # Check disk health (adjust /dev/sda as needed) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Update .bashrc_user_gist from remote gist | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __update_bashrc() { | |
| local LOCAL_FILE REMOTE_FILE_URL | |
| LOCAL_FILE="$HOME/.bashrc_user_gist" | |
| REMOTE_FILE_URL="https://gist.githubusercontent.com/AlexAtkinson/bc765a0c143ab2bba69a738955d90abd/raw/.bashrc" | |
| TASK="Retrieve remote .bashrc_user_gist" | |
| curl -sS "$REMOTE_FILE_URL" -o "$LOCAL_FILE.new"; rc 0 KILL | |
| TASK="Update local .bashrc_user_gist" | |
| mv "$LOCAL_FILE.new" "$LOCAL_FILE"; rc 0 KILL | |
| loggerx SUCCESS ".bashrc_user_gist updated. Run 'source ~/.bashrc_user_gist', or 'urc' to apply." | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Kill processes with a command matching input string(s). | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias __kill_by_name='pkill -f' # Kill process by name | |
| __kill_by_command_match() { | |
| [[ $# -eq 0 ]] && { echo "Usage: __kill_by_command_match <command match>"; return 1; } | |
|   # pgrep -a lists "PID full-command" for this user; keep lines matching the input string(s). | |
|   pgrep -u "${UID}" -a | \ | |
|     grep -F "$*" | \ | |
|     grep -v grep | \ | |
|     awk '{print $1}' | \ | |
|     xargs -r kill -9 -- | |
| } | |
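| # Example (illustrative; 'scratch.py' is a placeholder): kills this user's processes whose | |
| # command line contains the literal string. Prefer __kill_by_name for simple name matches. | |
| #   __kill_by_command_match 'python3 scratch.py' | |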
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Networking | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias __dns_flush='sudo systemd-resolve --flush-caches' # Flush DNS cache | |
| #alias _check_myip='curl -s https://ipinfo.io/ip' # Get public IP address | |
| alias _check_ipv4='curl -s -4 https://icanhazip.com' # Get public IPv4 address | |
| alias _check_myip='_check_ipv4' # Get public IP address | |
| alias _check_ipv6='curl -s -6 https://icanhazip.com' # Get public IPv6 address | |
| # shellcheck disable=2142 | |
| alias _check_localip="hostname -I | awk '{print \$1}'" # Get local IP address | |
| alias _check_ports_listening='sudo lsof -i -P -n | grep LISTEN' # Show listening ports | |
| _check_net_default_routes() { # Interfaces with default routes | |
| resolvectl | \ | |
| awk '/^Link/{a=1; buf=""} /DNS Servers/{c=1} {buf=buf $0 ORS} /Default Route: yes/{if (a && c) printf "%s\n", buf; a=c=0}' | \ | |
|   { bat -p -P -l cpp --color=always 2>/dev/null || cat ;} # bat with cpp highlights reasonably well; fall back to plain cat | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Connection Counts | |
| # Notes: | |
| # - Uses `ss` command | |
| # Outputs: | |
| # - Connection counts by state | |
| # Cyclomatic Complexity: 6 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _constat() { | |
| tmpfile=$(mktemp) | |
| ss -aH 2>/dev/null > "$tmpfile" | |
| printf "CONNECTION COUNTS (UDP,TCP):\n" | |
| printf -- "-----------------\n" | |
| for i in UNCONN LISTEN ESTAB FIN-WAIT-1 CLOSE-WAIT FIN-WAIT-2 LAST-ACK CLOSING TIME-WAIT; do | |
| printf '%s\n' "${i}: $(grep -ci "${i}" "${tmpfile}")" | |
| done | column -t | |
| printf '%s\n' "TOTAL: $(grep -c ^ "${tmpfile}")" | |
| rm -f "$tmpfile" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Misc : TUX (Terminal User Experience) (TM) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # CLIX | |
| alias urc='source ~/.bashrc' # Update rc in current terminal | |
| alias c='clear' # keystrokes -- the clicky killer | |
| command -v batcat >/dev/null 2>&1 && alias bat='batcat' # Bat alias for Debian-based systems | |
| # NOTE pager behavior of batcat. Use --paging=never (or -P) to disable. | |
| command -v bat >/dev/null 2>&1 && alias cat='bat -p' # Cat alias to use bat if available | |
| batcat_langcolor_help() { | |
| printf "%s\n" "Loop though supported languages to discover which may work for your use case." | |
| printf "%s\n" "This example shows iptables rules colorized." | |
| # shellcheck disable=2016 | |
| printf "%s\n" ' | |
| for lang in $(cat --list-languages | cut -d: -f2- | awk -F'\'','\'' '\''{print $1}'\''); do | |
| printSectionHeader "$lang" | |
| iptables -S | cat -l "$lang" | |
| sleep 2 | |
| done' | cat -P -l bash --paging=never --color=always 2>/dev/null | |
| } | |
| alias less='less -R' # Colorize less | |
| alias l1='ls -1' | |
| alias watch='watch --color' # Colorize watch | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Clipboard helpers | |
| #[[ ! $(command -v xclip) ]] && loggerx ERROR "${BASH_SOURCE[0]} - xclip not available!" | |
| # Check disabled as dependencies are managed by the system updater. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias clipc="xclip -selection c" | |
| alias clipp="xclip -selection c -o" | |
| alias clipv="clipp | less" | |
| # Tmux Control | |
| alias tmuxn='tmux new-session -s' | |
| alias tmuxk='tmux kill-session -t' | |
| alias tmuxa='tmux attach-session -t' | |
| alias tmuxl='tmux ls' | |
| tmuxwhereami() { | |
| if [[ -n $TMUX ]]; then | |
| printf '%b\n' "\e[01;39mINFO\e[0m: Current TMUX session: $(tmux display-message -p '#S')" | |
| else | |
| printf '%b\n' "\e[01;31mERROR\e[0m: Not in a TMUX session." | |
| fi | |
| } | |
| # Common colorization helper | |
| # Describes each severity color for accessibility & to spot terminal color drift | |
| color_helper() { | |
| printf '%b\n' "\nColorized Severity (rfc5424 - https://hackmd.io/@njjack/syslogformat)" ;\ | |
| printf '%b\n' "\e[01;30;41mEMERGENCY\e[0m \\\e[01;30;41mEMERGENCY\\\e[0m : 0 - Bold BLACK text, RED background" ;\ | |
| printf '%b\n' "\e[01;31;43mALERT\e[0m \\\e[01;31;43mALERT\\\e[0m : 1 - Bold RED text, YELLOW background" ;\ | |
| printf '%b\n' "\e[01;97;41mCRITICAL\e[0m \\\e[01;97;41mCRITICAL\\\e[0m : 2 - Bold WHITE text, RED background" ;\ | |
| printf '%b\n' "\e[01;31mERROR\e[0m \\\e[01;31mERROR\\\e[0m : 3 - Bold RED text" ;\ | |
| printf '%b\n' "\e[01;33mWARNING\e[0m \\\e[01;33mWARNING\\\e[0m : 4 - Bold YELLOW text" ;\ | |
| printf '%b\n' "\e[01;30;107mNOTICE\e[0m \\\e[01;30;107mNOTICE\\\e[0m : 5 - Bold BLACK text, WHITE background" ;\ | |
| printf '%b\n' "\e[01;39mINFORMATIONAL\e[0m \\\e[01;39mINFORMATIONAL\\\e[0m : 6 - Bold WHITE text" ;\ | |
| printf '%b\n' "\e[01;97;46mDEBUG\e[0m \\\e[01;97;46mDEBUG\\\e[0m : 7 - Bold WHITE text, CYAN background" ;\ | |
| printf '%b\n' "\e[01;32mSUCCESS\e[0m \\\e[01;32mSUCCESS\\\e[0m : 9 - Bold GREEN text (non-rfc5424)\n" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Show the age of a file. | |
| # Outputs: | |
| # - The age of a file in d,h,m,s. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| file_age() { | |
| [[ ! -f $1 ]] && return 1 | |
| local _file b s | |
| _file="$1" | |
| b="$(date --date="$(stat "$_file" | awk '/Birth:/ {print $2"T"$3}' | cut -d. -f1)" +%s)" | |
| s=$(($(date +%s) - b)) | |
| printf '%dd,%dh,%dm,%ds\n' \ | |
| $((s/86400)) \ | |
| $((s%86400/3600)) \ | |
| $((s%3600/60)) \ | |
| $((s%60)) | |
| } | |
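| # Example (illustrative; requires stat to report a Birth time for the filesystem): | |
| #   file_age ~/.bashrc   # e.g. 210d,4h,17m,33s | |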
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # List 10 largest directories for a given path. | |
| # Arguments: | |
| # - PATH | |
| # Outputs: | |
| # - List of 10 largest directories in descending order | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| find-large-dirs() { | |
| if [[ $# -ne 1 || $1 == "-h" ]]; then | |
| loggerx ERROR "Exactly one argument required: path (eg: / or /tmp/)" | |
| return 1 | |
| fi | |
|   du -hsx "${1%/}"/* 2> >(grep -v '^du: cannot \(access\|read\)' >&2) | sort -rh | head -10 | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Updates libffmpeg.so | |
| # Notes: | |
| # - This is handled by the system updater now. | |
| # Cyclomatic Complexity: 7 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| update-ffmpeg() { | |
| local INSTALL_DIRS=( '/usr/local/bin/ffmpeg' '/usr/lib/chromium-browser' '/usr/lib/x86_64-linux-gnu/opera') | |
| local FORCE | |
| [[ "$1" == "-f" ]] && FORCE='true' | |
| for dir in "${INSTALL_DIRS[@]}"; do | |
| [[ ! -f "$dir/libffmpeg.so" ]] && local FORCE='true' | |
|     if [[ ! -d "$dir" ]]; then | |
|       TASK="Create $dir"; et | |
|       sudo mkdir -p "$dir"; rc 0 KILL | |
|     fi | |
| done | |
| TASK="Detect installed version" | |
| local LOCAL_VERSION OS_TYPE LATEST_VERSION ASSET TEMP_DIR | |
| LOCAL_VERSION="$(< "${INSTALL_DIRS[0]}/ffmpeg.version")"; rc 0 | |
| TASK="Detect latest release version" | |
| LATEST_VERSION=$(git-latest-release-version nwjs-ffmpeg-prebuilt nwjs-ffmpeg-prebuilt); rc 0 | |
| if [[ "$LOCAL_VERSION" == "$LATEST_VERSION" ]] && [[ "$FORCE" != 'true' ]]; then | |
| loggerx INFO "Latest version of ffmpeg ($LATEST_VERSION) is already installed." | |
| return 0 | |
| fi | |
| TASK="Update from $LOCAL_VERSION to $LATEST_VERSION"; et | |
| OS_TYPE=$(_check_os_type) | |
| ASSET=$(git-latest-release-assets nwjs-ffmpeg-prebuilt nwjs-ffmpeg-prebuilt \ | |
| | grep -i "$OS_TYPE" \ | |
| | grep -i "$ARCH"); rc 0 KILL | |
| TEMP_DIR=$(mktemp -d) | |
| cd "$TEMP_DIR" || true | |
| TASK="Retrieve $ASSET"; et | |
| wget -q "https://github.com/nwjs-ffmpeg-prebuilt/nwjs-ffmpeg-prebuilt/releases/download/${LATEST_VERSION}/${ASSET}"; rc 0 KILL | |
| TASK="Unpack $ASSET"; et | |
| unzip -q "$ASSET"; rc 0 KILL | |
| for dir in "${INSTALL_DIRS[@]}"; do | |
| TASK="Install libffmpeg.so to $dir"; et | |
| sudo cp libffmpeg.so "$dir"; rc 0 | |
| TASK="Update ${dir}/ffmpeg.version"; et | |
| sudo rm "${dir}/ffmpeg.version"; | |
| echo "$LATEST_VERSION" | sudo tee "${dir}/ffmpeg.version" > /dev/null; rc 0 | |
| done | |
| cd - >/dev/null 2>&1 || true | |
| TASK="Update ffmpeg from $LOCAL_VERSION to $LATEST_VERSION"; rc 0 | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # List Only Directories. Wrapper for 'ls -d */' | |
| # Notes: | |
| # - Links to directories are output | |
| # Arguments: | |
| # Supported 'ls' arguments which do not require options, | |
| # such as -1. | |
| # Outputs: | |
| # List of directories | |
| # TODO: | |
| # - Colorize links differently. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| lsd() { | |
| local D_RAW ARGS D | |
| __load_D() { | |
|     for i in $D_RAW; do # Ensure all items have a trailing / | |
| [[ ! "$i" =~ '/'$ ]] && i="${i}/" | |
| D+=("$i") | |
| done | |
| } | |
| __do_lsd() { | |
| # shellcheck disable=2035 | |
| [[ "${D[*]}" == '' ]] && ls -"${ARGS}" */ # Do at least once for assumed $PWD | |
| for i in "${D[@]}"; do | |
| ls -"${ARGS}" "$i"*/ | |
| done | |
| } | |
| ARGS='ld' | |
| D=() | |
| { [[ "${1:0:1}" == '-' ]] && [[ "$1" != "${1/1/}" ]] ; } && ARGS='d' # Remove conflicting list (l) argument | |
| [[ "${1:0:1}" == '-' ]] && ARGS="${ARGS}${1#-}" | |
| if [[ "${1:0:1}" == '-' ]]; then | |
| D_RAW=${*:2} | |
| __load_D | |
| else | |
| D_RAW="$*" | |
| __load_D | |
| fi | |
| __do_lsd | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # List Only Directories. Alternative to lsd | |
| # Notes: | |
| # - Links to directories ARE NOT output. | |
| # Arguments: | |
| # Supported 'ls' arguments which do not require options, | |
| # such as -1. | |
| # Outputs: | |
| # List of directories | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| ld() { | |
| __load_D() { | |
| for i in $D_RAW; do | |
| D+=("$i") | |
| done | |
| } | |
| __do_ld() { | |
| # shellcheck disable=2012 | |
| if [[ "$ONE" != 'true' ]]; then | |
| for i in "${D[@]}"; do | |
| ls -"${ARGS}" "$i" | tail -n +4 | grep --color=never '^d' | |
| done | |
| [[ ${#D[*]} -eq 0 ]] && ls -"${ARGS}" | tail -n +4 | grep --color=never '^d' # At least once | |
| true | |
| else | |
| for i in "${D[@]}"; do | |
| ls -"${ARGS}" "$i" | tail -n +4 | grep '^d' | awk '{print $NF}' | |
| done | |
| [[ ${#D[*]} -eq 0 ]] && ls -"${ARGS}" | tail -n +4 | grep '^d' | awk '{print $NF}' # At least once | |
| true | |
| fi | |
| } | |
| ARGS='la' | |
| D=() | |
| { [[ "${1:0:1}" == '-' ]] && [[ "$1" != "${1/1/}" ]] ; } && local ONE='true' | |
| [[ "${1:0:1}" == '-' ]] && ARGS="${ARGS}${1#-}" | |
| if [[ "${1:0:1}" == '-' ]]; then | |
| D_RAW=${*:2} | |
| __load_D | |
| else | |
| D_RAW="$*" | |
| __load_D | |
| fi | |
| __do_ld | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Terraform | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias tf='terraform' | |
| alias tfc='tf console' | |
| alias tfv='tf validate' | |
| alias tff='tf fmt' | |
| alias tfp='tf plan' | |
| alias tfi='tf init' | |
| alias tfa='tf apply' | |
| alias tfw='tf workspace' | |
| alias tfws='tfw show' | |
| alias tfwl='tfw list' | |
| alias tfwsl='tfw select' | |
| alias tfwn='tfw new' | |
| alias tfwd='tfw delete' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Convert spaces to tabs. | |
| # Arguments: | |
| # 2,4 The size of the space-tab to convert | |
| # FILE The file to handle. | |
| # Not compatible with STRINGS | |
| # STRINGS The strings to handle. | |
| # Not compatible with FILE | |
| # Outputs: | |
| # The converted text (to stdout) | |
| # Usage: | |
| # spaces_to_tabs 4 File > F.tmp; mv F.tmp File | |
| # WARNING: Don't truncate our original file. | |
| # TODO: Optimize | |
| # Tags: utility, awful, this-is-a-pet | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2001 | |
| spaces_to_tabs() { | |
| if [[ $1 -ne 2 ]] && [[ $1 -ne 4 ]]; then | |
| loggerx ERROR "First argument must be one of: (2,4)" | |
| return 1 | |
| fi | |
| if [[ ! -f $2 ]]; then | |
| [[ $1 -eq 4 ]] && sed 's/^\s\s\s\s/\t/g'<<<"${@:2}" | |
| [[ $1 -eq 2 ]] && sed 's/^\s\s/\t/g'<<<"${@:2}" | |
| else | |
| [[ $1 -eq 4 ]] && sed 's/^\s\s\s\s/\t/g' "$2" | |
| [[ $1 -eq 2 ]] && sed 's/^\s\s/\t/g' "$2" | |
| fi | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Convert all string characters to lower, or upper. | |
| # Usage: | |
| # Pipe to function. IE: echo HI | lower | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| lower() { tr '[:upper:]' '[:lower:]' ; } | |
| upper() { tr '[:lower:]' '[:upper:]' ; } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Escape string for bash usage. | |
| # Usage: | |
| # Pipe to function. IE: escape_string <<< "Hello World!" | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| escape_string() { sed -e 's/[^a-zA-Z0-9,._+@%/-]/\\&/g; 1{$s/^$/""/}; 1!s/^/"/; $!s/$/"/'; } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Prints a basic title. | |
| # Arguments: | |
| # STRING(s) Title Text | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2183,2059 | |
| printTitle() { | |
| local txt | |
| txt="$*" | |
| printf '%*s' "$((COLUMNS-(COLUMNS-$(wc -c<<<"$txt")-3)))" | tr ' ' \# | |
| printf "\n\e[01m# ${txt} #\e[0m" | |
| printf '\n%*s' "$((COLUMNS-(COLUMNS-$(wc -c<<<"$txt")-3)))" | tr ' ' \# | |
| printf '\n' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Prints a basic section header. | |
| # Arguments: | |
| # STRING(s) Header Text | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| printSectionHeader_v1 () { | |
| local txt | |
| txt="$*" | |
| printf "\n\e[01m%s\e[0m" "$txt" | |
| # shellcheck disable=2183 | |
| printf '\n%*s' "$(tr -d '\n'<<<"$txt" | wc -c)" | tr ' ' - | |
| printf '\n' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Prints a basic section header (v2). | |
| # Arguments: | |
| # STRING(s) Header Text | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| printSectionHeader () | |
| { | |
| local txt wrap | |
| txt="$*" | |
| # shellcheck disable=2183 | |
| wrap="$(printf '%*s' "$(tr -d '\n'<<<"$txt" | wc -c)" | tr ' ' \~)" | |
| printf "\n" | |
| printf "\e[01m%s\e[0m\n" "# $wrap" | |
| printf "\e[01m%s\e[0m\n" "# $txt" | |
| printf "\e[01m%s\e[0m\n" "# $wrap" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Figlet Favs | |
| # Arguments: | |
| # STRING(s) Text to render | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| figlet_fav_ansi_shadow() { | |
| /usr/bin/figlet -t -f ANSI_Shadow "$*" | sed '/^[[:space:]]*$/d' | |
| } | |
| alias printFigletHeader='figlet_fav_ansi_shadow' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Prints a fancy section header. | |
| # See help menu for details. | |
| # TODO: Deengineer... This is over engineered... It was trivial once. | |
| # WARN: Cyclomatic Complexity = 21 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Test with: | |
| # for opt in " " "-e"; do for i in 0 $(($(tput cols) / 8)) $(($(tput cols) / 8 +1)) $(($(tput cols) / 2)) $(($(tput cols) +10)); do for t in "H" "XX"; do printFancyHeader -w $i -g -t "$t" "$opt" ; done; done; done | |
| # for opt in " " "-e"; do printFancyHeader -t "$(printf '%*s' "$(( $(tput cols) +10 ))" | sed "s/ /X/g")" "$opt"; done | |
| function printFancyHeader() { | |
| help() { | |
| cat << EOF | |
| Print a fancy heading. | |
| Use: ${0##*/} [-w <int>] [-f <bool>] [-v <char>] [-h <char>] [-i <char>] [-t <string(s)>] | |
| -w INT The width of the header Default: Full Screen | |
| Minimum: Title + 8 | |
| Note: If odd, this program will subtract 1. | |
| -v CHAR Vertical border character Default: \# | |
| -h CHAR Horizontal border character Default: = | |
| -i CHAR Internal fill character Default: - | |
| -t TITLE "Title Text" (double quoted) Default: Hello World! | |
| -f Fill Default: true | |
| -g Gutters Default: false | |
| -e Emoji Mode Default: Disabled | |
| Fullwidth/thick (DOUBLE SPACE) emoji's only. | |
| Auto-transforms: | |
| [a-zA-Z0-9]( :;<>?!#$&()*+-_~|{}@[]^) | |
| Some characters require escapes(\). | |
| 🛸 https://www.amp-what.com/unicode/search/fullwidth 🛸 | |
| NOTE: Some characters require escaping. | |
| Examples: | |
| printFancyHeader -w 20 -t Cool Title | |
| printFancyHeader -w 60 -v \* -h \~ -i . -t "Cool Title" -f false | |
| printFancyHeader -v 🐵 -h 🛻 -i 🌭 -t "Monkey Business" -e -g -w $(($(tput cols)/2)) | |
| printFancyHeader -t "[ Hello World ]" -e -w 44 | |
| EOF | |
| } | |
| visual_width() { | |
| local str width char char_bytes | |
| str="$1" | |
| if width=$(echo -n "$str" | wc -L 2>/dev/null); then | |
| echo "$width" | |
| return | |
| fi | |
| local width=0 | |
| local i | |
| for ((i=0; i<${#str}; i++)); do | |
| char="${str:$i:1}" | |
| char_bytes=$(echo -n "$char" | wc -c) | |
| if (( char_bytes > 1 )); then # Emoji | |
| ((width += 2)) | |
| else # ASCII | |
| ((width += 1)) | |
| fi | |
| done | |
| echo "$width" | |
| } | |
| repeat_to_width() { | |
| local char target_width current_width result char_width count i max_iterations iterations space_char | |
| char="$1" | |
| target_width="$2" | |
| space_char="$3" | |
| result="" | |
| char_width=$(visual_width "$space_char") | |
| (( char_width == 0 )) && char_width=1 | |
| if (( target_width <= 0 )); then | |
| echo "" | |
| return | |
| fi | |
| (( count = target_width / char_width )) | |
| for ((i=0; i<count; i++)); do | |
| result+="$char" | |
| done | |
| current_width=$(visual_width "$result") | |
| max_iterations=10 | |
| iterations=0 | |
| while (( current_width < target_width && iterations < max_iterations )); do | |
| result+="$space_char" | |
| current_width=$(visual_width "$result") | |
| ((iterations++)) | |
| done | |
| while (( current_width > target_width && ${#result} > 0 )); do | |
| result="${result::-$char_width}" | |
| current_width=$(visual_width "$result") | |
| done | |
| echo "$result" | |
| } | |
| # Emoji Character Dictionary | |
| declare -rA e_c=( | |
| ["a"]="a" ["b"]="b" ["c"]="c" ["d"]="d" ["e"]="e" ["f"]="f" ["g"]="g" | |
| ["h"]="h" ["i"]="i" ["j"]="j" ["k"]="k" ["l"]="l" ["m"]="m" | |
| ["n"]="n" ["o"]="o" ["p"]="p" ["q"]="q" ["r"]="r" ["s"]="s" ["t"]="t" | |
| ["u"]="u" ["v"]="v" ["w"]="w" ["x"]="x" ["y"]="y" ["z"]="z" | |
| ["A"]="A" ["B"]="B" ["C"]="C" ["D"]="D" ["E"]="E" ["F"]="F" ["G"]="G" | |
| ["H"]="H" ["I"]="I" ["J"]="J" ["K"]="K" ["L"]="L" ["M"]="M" | |
| ["N"]="N" ["O"]="O" ["P"]="P" ["Q"]="Q" ["R"]="R" ["S"]="S" ["T"]="T" | |
| ["U"]="U" ["V"]="V" ["W"]="W" ["X"]="X" ["Y"]="Y" ["Z"]="Z" | |
| ["0"]="0" ["1"]="1" ["2"]="2" ["3"]="3" ["4"]="4" | |
| ["5"]="5" ["6"]="6" ["7"]="7" ["8"]="8" ["9"]="9" | |
| [" "]=" " [":"]=":" [";"]=";" ["<"]="<" [">"]=">" ["?"]="?" ["!"]="!" | |
| ["#"]="#" ["$"]="$" ["&"]="&" ["("]="(" [")"]=")" ["*"]="*" ["-"]="-" | |
| ["+"]="+" ["_"]="_" ["~"]="~" ["|"]="|" ["{"]="{" ["}"]="}" ["@"]="@" | |
| ["["]="[" ["]"]="]" ["^"]="^" ["."]=". " | |
| ) | |
| local txt width min_width max_width target_width \ | |
| vert_c horz_c intr_c ogut_c igut_c space_c \ | |
| vert_width ogut_width igut_width space_width \ | |
| fill gut emoji \ | |
| title_text title_visual_width \ | |
| border_width content_width fill_width_l fill_width_r \ | |
| horz_border_fill intr_fill intr_half_fill_l intr_half_fill_r \ | |
| c char_width padding max_title_width truncated current_width | |
| txt=() | |
| OPTIND=1 | |
| while getopts "w:v:h:i:fget:" OPT; do | |
| case "$OPT" in | |
| w) width="$OPTARG" ;; | |
| v) vert_c="$OPTARG" ;; | |
| h) horz_c="$OPTARG" ;; | |
| i) intr_c="$OPTARG" ;; | |
| f) fill="false" ;; | |
| g) gut="true" ;; | |
| e) emoji="true" ;; | |
| t) set -f | |
| IFS=' ' | |
| # shellcheck disable=2206 | |
| txt=($OPTARG) ;; | |
| :) echo "ERROR: -$OPTARG requires an argument." | |
| help; return 1 ;; | |
| *) help; return 1 ;; | |
| esac | |
| done | |
| shift $((OPTIND-1)) | |
| set +f | |
| max_width=$(tput cols) # Maximum box width (terminal width) | |
| width="${width:-$max_width}" # Title box width | |
| fill="${fill:-true}" # Fill option | |
| gut="${gut:-false}" # Gutter option | |
| # Title Text Control | |
| # shellcheck disable=2206 | |
| txt=(${txt[@]:-Hello World\!}) # Title text | |
| if [[ "$emoji" == "true" ]]; then # Emoji Transform | |
| c="${txt[*]}" | |
| # shellcheck disable=2207 | |
| txt=($(for ((i=0; i<${#c}; i++)); do | |
| printf "%s" "${e_c["${c:$i:1}"]}" | |
| done)) | |
| fi | |
| title_text="${txt[*]}" | |
| title_visual_width=$(visual_width "$title_text") | |
| if [[ "$emoji" != "true" ]]; then # Default characters | |
| [[ "$fill" == "false" ]] && intr_c=' ' | |
| vert_c="${vert_c:-#}" | |
| horz_c="${horz_c:-=}" | |
| intr_c="${intr_c:--}" | |
| space_c=" " | |
| ogut_c=" " | |
| igut_c=" " | |
| else | |
| [[ "$fill" == "false" ]] && intr_c="${e_c[" "]}" | |
| vert_c="${vert_c:-🛸}" | |
| horz_c="${horz_c:-🛸}" | |
| intr_c="${intr_c:-👽}" | |
| space_c="${e_c[" "]}" | |
| ogut_c="${e_c[" "]}" | |
| igut_c="${e_c[" "]}" | |
| fi | |
| if [[ "$gut" == "false" ]]; then # Gutter | |
| ogut_c="$horz_c" | |
| igut_c="$intr_c" | |
| fi | |
| vert_width=$(visual_width "$vert_c") # Element widths | |
| ogut_width=$(visual_width "$ogut_c") | |
| igut_width=$(visual_width "$igut_c") | |
| space_width=$(visual_width "$space_c") | |
| local min_fill_char_width # Minimum fill | |
| min_fill_char_width=$(visual_width "$intr_c") | |
| padding=$((2 * vert_width + 2 * igut_width + 2 * space_width + 2 * min_fill_char_width)) | |
| max_title_width=$((max_width - padding - 3)) # 3 for ellipses. Truncate if too long | |
| if (( title_visual_width > max_title_width )); then | |
| # Truncate and add ellipses | |
| truncated="" | |
| current_width=0 | |
| for ((i=0; i<${#title_text}; i++)); do | |
| char="${title_text:$i:1}" | |
| char_width=$(visual_width "$char") | |
| if (( current_width + char_width + 3 > max_title_width )); then | |
| break | |
| fi | |
| truncated+="$char" | |
| ((current_width += char_width)) | |
| done | |
| [[ "$emoji" == "true" ]] || title_text="${truncated}..." | |
| [[ "$emoji" == "true" ]] && title_text="${truncated}. . . " | |
| title_visual_width=$(visual_width "$title_text") | |
| fi | |
| min_width=$((title_visual_width + padding)) # Box Width | |
| (( width < min_width )) && width=$min_width | |
| (( width > max_width )) && width=$max_width | |
| border_width=$((width - 2 * vert_width - 2 * ogut_width)) # Border Width | |
| content_width=$((width - 2 * vert_width - 2 * igut_width)) # Content row | |
| title_with_spaces_width=$((2 * space_width + title_visual_width)) # Title row | |
| total_fill_width=$((content_width - title_with_spaces_width)) | |
| # Ensure minimum width | |
| if (( total_fill_width < 0 )); then | |
| content_width=$((title_with_spaces_width + 2)) | |
| width=$((content_width + 2 * vert_width + 2 * igut_width)) | |
| border_width=$((width - 2 * vert_width - 2 * ogut_width)) | |
| total_fill_width=2 | |
| fi | |
| fill_width_l=$((total_fill_width / 2)) | |
| fill_width_r=$((total_fill_width - fill_width_l)) # Observes odd widths | |
| # Ensure minimum fill of 1 | |
| (( fill_width_l < 1 )) && fill_width_l=1 | |
| (( fill_width_r < 1 )) && fill_width_r=1 | |
| # Width offset for emoji mode | |
| if [[ "$emoji" == "true" ]]; then | |
| (( content_width%2 )) || (( fill_width_r++ )) | |
| if (( content_width > min_width )); then | |
| (( fill_width_l > 2 )) && (( fill_width_l-- )) | |
| (( fill_width_l > 2 )) && (( fill_width_r++ )) | |
| fi | |
| fi | |
| horz_border_fill=$(repeat_to_width "$horz_c" "$border_width" "$space_c") | |
| intr_fill=$(repeat_to_width "$intr_c" "$content_width" "$space_c") | |
| intr_half_fill_l=$(repeat_to_width "$intr_c" "$fill_width_l" "$space_c") | |
| intr_half_fill_r=$(repeat_to_width "$intr_c" "$fill_width_r" "$space_c") | |
| #echo "w: $width, bw: $border_width, fl: $fill_width_l, fr: $fill_width_r, tfl: $total_fill_width, tl: $title_visual_width" | |
| printf '%s\n' "${vert_c}${ogut_c}${horz_border_fill}${ogut_c}${vert_c}" | |
| printf '%s\n' "${vert_c}${igut_c}${intr_fill}${igut_c}${vert_c}" | |
| printf '%s' "${vert_c}${igut_c}${intr_half_fill_l}" | |
| printf "%b" "\e[01;39m${space_c}${title_text}${space_c}\e[0m" | |
| printf '%s\n' "${intr_half_fill_r}${igut_c}${vert_c}" | |
| printf '%s\n' "${vert_c}${igut_c}${intr_fill}${igut_c}${vert_c}" | |
| printf '%s\n' "${vert_c}${ogut_c}${horz_border_fill}${ogut_c}${vert_c}" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Data conversion helpers. | |
| # Arguments: | |
| # Numeric Value Amount of bits or bytes | |
| # Outputs: | |
| # Converted Value In bytes or bits | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _bits_to_bytes() { | |
| if [[ -n "$1" ]] && [[ "$1" =~ ^[0-9]+$ ]]; then | |
| result=$(( $1 / 8 )) | |
| [[ $(( $1 % 8 )) -ne 0 ]] && result=$(( result + 1 )) | |
| echo $result | |
| else | |
| loggerx ERROR "First argument must be an integer." | |
| return 1 | |
| fi | |
| } | |
| _bytes_to_bits() { | |
| if [[ -n "$1" ]] && [[ "$1" =~ ^[0-9]+$ ]]; then | |
| echo $(( $1 * 8 )) | |
| else | |
| loggerx ERROR "First argument must be an integer." | |
| return 1 | |
| fi | |
| } | |
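| # Example: partial bytes are rounded up. | |
| #   _bits_to_bytes 12   # 2 | |
| #   _bytes_to_bits 4    # 32 | |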
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Convert bits to human readable format. | |
| # Arguments: | |
| # bits IE: '10000' will return 10.00 Kb | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2120 | |
| bitsToHumanReadableBits() { | |
| local input i d s S | |
| if test -n "$1"; then | |
| input="$*" | |
| elif test ! -t 0; then | |
| input="$(</dev/stdin)" | |
| fi | |
|   i=${input:-0} d="" s=0 S=("bits" "Kb" "Mb" "Gb" "Tb" "Pb" "Eb" "Zb" "Yb") | |
| while ((i > 1000 && s < ${#S[@]}-1)); do | |
| printf -v d ".%02d" $((i % 1000 * 100 / 1000)) | |
| i=$((i / 1000)) | |
| s=$((s + 1)) | |
| done | |
| echo "$i$d ${S[$s]}" | |
| } | |
| alias b2H='bitsToHumanReadableBits' | |
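| # Example (1000-step scaling, unit labels as above): | |
| #   bitsToHumanReadableBits 10000   # 10.00 Kb | |
| #   echo 2500000 | b2H              # 2.50 Mb | |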
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Convert bytes to human readable format automatically. | |
| # Arguments: | |
| #   bytes IE: '10240' will return 10.00 KiB | |
| # Cyclomatic Complexity: 6 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2120 | |
| bytesToHumanReadable() { | |
| local input i d s S | |
| if [[ -n "$1" ]]; then | |
| input="$*" | |
| elif [[ ! -t 0 ]]; then | |
| input="$(</dev/stdin)" | |
| fi | |
|   i=${input:-0} d="" s=0 S=("Bytes" "KiB" "MiB" "GiB" "TiB" "PiB" "EiB" "ZiB" "YiB") | |
| while ((i > 1024 && s < ${#S[@]}-1)); do | |
| printf -v d ".%02d" $((i % 1024 * 100 / 1024)) | |
| i=$((i / 1024)) | |
| s=$((s + 1)) | |
| done | |
| echo "$i$d ${S[$s]}" | |
| } | |
| alias B2H='bytesToHumanReadable' | |
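| # Example (1024-step scaling): | |
| #   bytesToHumanReadable 1536    # 1.50 KiB | |
| #   stat -c %s /bin/bash | B2H   # size of the bash binary, human readable | |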
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Convert bits or bytes to human readable format in the | |
| # specified units. | |
| # | |
| # Arguments: | |
| # input_units Input Units | |
| #                   "bits" "Kb" "Mb" "Gb" "Tb" "Pb" "Eb" "Zb" "Yb" | |
| #                   "Bytes" "KiB" "MiB" "GiB" "TiB" "PiB" "EiB" "ZiB" "YiB" | |
| # value Amount of units | |
| #   output_units    Output unit ('bits' or 'bytes') | |
| # Outputs: | |
| # Converted value | |
| # Notes: | |
| # - Uses 1000 base for bits, 1024 base for bytes. | |
| # - Rounds to 2 decimal places. | |
| # Cyclomatic Complexity: 10 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| data_unit_converter() { | |
| if [[ "$1" == "-h" ]] || [[ "$1" == "--help" ]]; then | |
| cat << EOF | |
| Convert bits or bytes to human readable format. | |
| Usage: data_unit_converter <input_units> <value> <output_units> | |
| Examples: | |
| data_unit_converter bits 10000 bytes | |
| data_unit_converter bytes 10240 bits | |
| EOF | |
| return 0 | |
| fi | |
| local input_units value output_units i d s S multiplier | |
| if [[ -n "$1" ]] && [[ "$1" =~ ^(bits|bytes)$ ]]; then | |
| input_units="$1" | |
| else | |
| loggerx ERROR "First argument must be one of: bits, bytes" | |
| return 1 | |
| fi | |
| if [[ -n "$2" ]] && [[ "$2" =~ ^[0-9]+$ ]]; then | |
| value="$2" | |
| else | |
| loggerx ERROR "Second argument must be a numeric value." | |
| return 1 | |
| fi | |
| if [[ -n "$3" ]] && [[ "$3" =~ ^(bits|bytes)$ ]]; then | |
| output_units="$3" | |
| else | |
| loggerx ERROR "Third argument must be one of: bits, bytes" | |
| return 1 | |
| fi | |
|   # Convert the raw value to the output unit system first, then scale. | |
|   if [[ "$input_units" != "$output_units" ]]; then | |
|     if [[ "$output_units" == "bits" ]]; then | |
|       value=$((value * 8)) | |
|     else | |
|       value=$((value / 8)) | |
|     fi | |
|   fi | |
|   i=${value:-0} d="" s=0 | |
|   if [[ "$output_units" == "bits" ]]; then | |
|     S=("bits" "Kb" "Mb" "Gb" "Tb" "Pb" "Eb" "Zb" "Yb") | |
|     multiplier=1000 | |
|   else | |
|     S=("Bytes" "KiB" "MiB" "GiB" "TiB" "PiB" "EiB" "ZiB" "YiB") | |
|     multiplier=1024 | |
|   fi | |
|   while ((i > multiplier && s < ${#S[@]}-1)); do | |
|     printf -v d ".%02d" $((i % multiplier * 100 / multiplier)) | |
|     i=$((i / multiplier)) | |
|     s=$((s + 1)) | |
|   done | |
|   echo "$i$d ${S[$s]}" | |
| } | |
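| # Example (with the convert-then-scale order above): | |
| #   data_unit_converter bits 10000 bytes   # 1.22 KiB (10000 bits = 1250 bytes) | |
| #   data_unit_converter bytes 10240 bits   # 81.92 Kb | |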
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Convert bytes to human readable format in a unit defined | |
| # by user input. | |
| # Arguments: | |
| # bytes IE: '10240' (may also be piped via stdin) | |
| # unit IE: "KiB" "MiB" "GiB" "TiB" "PiB" "EiB" "KB" "MB" "GB" "TB" "PB" "EB" | |
| # Outputs: | |
| # Converted value (truncated to an integer) | |
| # Notes: | |
| # - Units above EiB/EB are unsupported due to 64-bit bash arithmetic. | |
| # Cyclomatic Complexity: 6 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| bytesTo() { | |
| local bytes unit i s S index system multiplier binary_S binary_S_multiplier decimal_S decimal_S_multiplier | |
| # NOTE: bash integer arithmetic is 64-bit, so ZiB/YiB (and ZB/YB) multipliers would overflow. | |
| binary_S=("Bytes" "KiB" "MiB" "GiB" "TiB" "PiB" "EiB") | |
| binary_S_multiplier=(1 1024 1048576 1073741824 1099511627776 1125899906842624 1152921504606846976) | |
| decimal_S=("Bytes" "KB" "MB" "GB" "TB" "PB" "EB") | |
| decimal_S_multiplier=(1 1000 1000000 1000000000 1000000000000 1000000000000000 1000000000000000000) | |
| if [[ -n "$1" ]] && [[ "$1" =~ ^[0-9]+$ ]]; then | |
| bytes="$1" | |
| elif [[ ! -t 0 ]]; then | |
| bytes="$(</dev/stdin)" | |
| fi | |
| if [[ -n "$2" ]]; then | |
| unit="$2" | |
| elif [[ -n "$1" ]] && [[ ! "$1" =~ ^[0-9]+$ ]]; then | |
| unit="$1" | |
| else | |
| unit="Bytes" | |
| fi | |
| i=${bytes:-0} | |
| s=0 | |
| if [[ " ${binary_S[*]} " == *" $unit "* ]]; then | |
| S=("${binary_S[@]}") | |
| system="binary" | |
| elif [[ " ${decimal_S[*]} " == *" $unit "* ]]; then | |
| S=("${decimal_S[@]}") | |
| system="decimal" | |
| else | |
| loggerx ERROR "Unit must be one of: ${binary_S[*]}, ${decimal_S[*]}" | |
| return 1 | |
| fi | |
| for index in "${!S[@]}"; do | |
| if [[ "${S[$index]}" == "$unit" ]]; then | |
| s=$index | |
| if [[ "$system" == "binary" ]]; then | |
| multiplier=${binary_S_multiplier[$index]} | |
| else | |
| multiplier=${decimal_S_multiplier[$index]} | |
| fi | |
| i=$(( i / multiplier )) | |
| break | |
| fi | |
| done | |
| echo "$i ${S[$s]}" | |
| } | |
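| # Example (illustrative): | |
| #   bytesTo 5368709120 GiB          # -> 5 GiB | |
| #   echo 1048576000 | bytesTo MB    # -> 1048 MB (truncated, not rounded) | |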
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # PW Material Generator | |
| # Notes: | |
| # - Used by genpass | |
| # - Generates 20 characters per invocation | |
| # - Takes ~1ms per character generated on an average | |
| # system | |
| # - Will _always_ lead with an ALPHA character | |
| # - Will _always_ include _at least one_ SPECIAL character | |
| # when the -s argument is supplied | |
| # Arguments: | |
| # - -s Include special characters | |
| # Outputs: | |
| # - Randomly generated password 20 characters in length | |
| # TODO: | |
| # - Allow SPEC input to facilitate various tool compliance | |
| # Previously looped pwgen until a compliant string was | |
| # produced. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| __genpass_fount() { | |
| local SPEC SPEC_SAFE ALPH_NUM ALPH_LEAD RANDUP OUTPUT_TAIL | |
| [[ "$2" == "-s" ]] && loggerx ERROR "Length must be specified after any arguments." && return 1 | |
| if [[ "$1" == "-s" ]]; then | |
| SPEC='!@#$%^&*()<>[]{}|_+-=' | |
| SPEC_SAFE="${SPEC:$(( RANDOM % ${#SPEC} )):1}" | |
| ALPH_NUM=$(openssl rand -base64 128 | tr -dc 'a-zA-Z0-9' | tr -d '\n' | head -c 128) | |
| ALPH_LEAD=$(openssl rand -base64 128 | tr -dc 'a-zA-Z' | tr -d '\n' | head -c 1) | |
| RANDUP=$(echo -n "${SPEC}${ALPH_NUM}" | fold -w 1 | shuf | tr -d "\n" | head -c 18 | head -n 1) | |
| OUTPUT_TAIL=$(echo -n "${RANDUP}${SPEC_SAFE}" | fold -w 1 | shuf | tr -d "\n") | |
| echo "${ALPH_LEAD}${OUTPUT_TAIL}" | |
| else | |
| openssl rand -base64 128 | tr -dc 'a-zA-Z0-9' | head -c 20 | head -n 1 | |
| echo '' | |
| fi | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # PW Generator | |
| # Notes: | |
| # - Same as __genpass_fount | |
| # - No length limitation | |
| # - Defaults to 20 characters | |
| # Arguments: | |
| # - -s Include special characters | |
| # - <len> int Length of the output string | |
| # Outputs: | |
| # - Randomly generated password of any length. | |
| # Cyclomatic Complexity: 7 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| genpass() { | |
| [[ "$2" == "-s" ]] && loggerx ERROR "Length must be specified after any arguments." && return 1 | |
| local LEN="20" | |
| local FOUNT_LEN='20' | |
| if [[ "$1" == "-s" ]]; then | |
| { [[ -n $2 ]] && [[ $2 -ne $FOUNT_LEN ]] ; } && local LEN="$2" | |
| local GEN_RUNS=$(( (LEN / FOUNT_LEN) + 1 )) | |
| for ((GEN=0; GEN <= GEN_RUNS; GEN++)); do | |
| __genpass_fount -s | tr -d '\n' | |
| done | head -c "$LEN" | |
| echo '' | |
| else | |
| { [[ -n $1 ]] && [[ $1 -ne $FOUNT_LEN ]] ;} && local LEN="$1" | |
| local GEN_RUNS=$(( (LEN / FOUNT_LEN) + 1 )) | |
| for ((GEN=0; GEN <= GEN_RUNS; GEN++)); do | |
| __genpass_fount | tr -d '\n' | |
| done | head -c "$LEN" | |
| echo '' | |
| fi | |
| } | |
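| # Example (illustrative): | |
| #   genpass          # 20 alphanumeric characters | |
| #   genpass -s 32    # 32 characters, leading ALPHA, at least one SPECIAL | |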
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Encrypt a password with bcrypt. | |
| # Notes: | |
| # - Cannot verify with htpasswd without populating an | |
| # htpasswd file. | |
| # - Most useful for planned credential rotations. | |
| # htpasswd examples | |
| # - Create htpasswd file for user 'foo' | |
| # htpasswd -cBC '12' htpasswdfile foo | |
| # - Verify password | |
| # htpasswd -v htpasswdfile foo | |
| # Arguments | |
| # Computing Time <int> Specify computing time | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| encrypt_pass_bcrypt() { | |
| read -srp "Password: " password | |
| echo '' | |
| local computing_time="$1" | |
| if [[ -z $computing_time ]]; then | |
| loggerx WARNING "Computing time not supplied. Using Default: 12." | |
| local computing_time="12" | |
| fi | |
| htpasswd -bnBC "$computing_time" "" "$password" | tr -d ':\n' | |
| echo '' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Create an encrypted vault | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| vault_create() { | |
| function show_help() { | |
| cat << EOF | |
| Create an encrypted vault from a directory or file. | |
| Example: vault_create -t <target> [-w] [-h] | |
| Arguments: | |
| -t The TARGET to encrypt. | |
| -w WIPE the target content upon completion of archive creation. | |
| WARNING: Perhaps verify archive prior to wiping content. | |
| -h Display this help menu. | |
| EOF | |
| HELPED="true" | |
| } | |
| [[ $# -eq 0 ]] && show_help | |
| OPTIND=1 | |
| while getopts "t:hw" opt; do | |
| case "$opt" in | |
| t) TARG="$OPTARG" ;; | |
| h) show_help ;; | |
| w) arg_w='set' ;; | |
| *) echo "ERROR: Unknown option!"; show_help ;; | |
| esac | |
| done | |
| shift "$((OPTIND-1))" | |
| [ "$1" = "--" ] && shift | |
| if [[ "$HELPED" == "true" ]]; then | |
| unset HELPED | |
| return 1 | |
| fi | |
| loggerx INFO "Encrypting $TARG" | |
| tar cz "${TARG}/" | openssl enc -aes-256-cbc -md sha512 -pbkdf2 -out "${TARG}.tar.gz.dat" | |
| [[ "$arg_w" == "set" ]] && wipe -rf "$TARG" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Open an encrypted vault. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| vault_open() { | |
| VAULT="$1" | |
| openssl enc -aes-256-cbc -md sha512 -pbkdf2 -d -in "$VAULT" | tar -xz | |
| } | |
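| # Example (illustrative); 'secrets' is a placeholder directory: | |
| #   vault_create -t secrets          # prompts for a passphrase, writes secrets.tar.gz.dat | |
| #   vault_open secrets.tar.gz.dat    # prompts for the passphrase, restores secrets/ | |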
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Trace redirects | |
| # Notes: | |
| # - Uses curl and wget | |
| # - Outputs time taken for each hop | |
| # - Outputs server and response code for each hop | |
| # - Outputs total time taken for initial asset | |
| # Arguments: | |
| # URL The URL to trace | |
| # Outputs: | |
| # Redirect trace with timings and server/response info | |
| # Cyclomatic Complexity: 8 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| dnsTraceRedirects() { | |
| url=$1; | |
| totalTime=0; | |
| unset run; | |
| while [[ "$run" != 'term' ]]; do | |
| ts=$(date +%s%N); | |
| curl -skI "${url}" > /dev/null 2>&1; | |
| tt=$((($(date +%s%N) - "$ts")/1000000)); | |
| totalTime=$((totalTime + tt)); | |
| [[ "$run" != 'last' ]] && server=$(wget --server-response --no-check-certificate --max-redirect 0 --tries 1 "${url}" 2>&1 | awk '/^ Server:/{print $2}'); | |
| [[ "$run" != 'last' ]] && response=$(wget --server-response --no-check-certificate --max-redirect 0 --tries 1 "${url}" 2>&1 | awk '/^ HTTP/{print $1 "("$2")"}'); | |
| [[ "$run" != 'last' ]] && printf "%s" "$tt ms : ${url}"; | |
| url=$(wget --server-response --no-check-certificate --tries 1 -O - "${url}" 2>&1 | head -n25 | awk '/^Location/{print $2; exit}'); | |
| if [[ -n $url ]]; then | |
| printf "%s\n" " [ ${server}:${response} ] >> ${url}"; | |
| else | |
| if [[ "$run" != 'last' ]]; then | |
| run=last; | |
| else | |
| printf "%s\n" " [ ${server}:${response} ] (Terminated)"; | |
| printf "%s\n" "Total Time (initial asset): ${totalTime} ms"; | |
| run='term'; | |
| fi; | |
| fi; | |
| done | |
| } | |
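| # Example (illustrative): | |
| #   dnsTraceRedirects http://github.com | |
| #   # Prints '<ms> ms : <url> [ <server>:<code> ] >> <next_url>' per hop, | |
| #   # then 'Total Time (initial asset): <ms> ms'. | |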
| alias digna='dig +noall +answer' | |
| alias digns='dig NS +noall +answer' | |
| alias digsoa='dig SOA +noall +answer' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # dig SOA Host Record | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # SOA Host from the authoritative nameserver | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| digsoahost() { | |
| digsoa "$1" | awk '{gsub(/.$/,"",$5); print $5}' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # dig TTL Record | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # TTL from the authoritative nameserver | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| digttl() { | |
| dig +noall +answer "$1" @"$(digns "$1" | awk 'NR==1 {print $5}')" | awk 'NR==1 {print $2}' | |
| } | |
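| # Example (illustrative); example.com is a placeholder domain: | |
| #   digna example.com       # answer records only | |
| #   digsoahost example.com  # primary (SOA) nameserver host | |
| #   digttl example.com      # TTL from the authoritative nameserver | |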
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # nslookup SOA Record | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # SOA Record from the authoritative nameserver | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| nslookupsoa() { | |
| nslookup "$1" "$(digsoahost "$1")" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get TLS SANs | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # Subject Alternative Names from the TLS certificate | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| ssl-get-sans() { | |
| printf "Q" | openssl s_client -connect "$1":443 -servername "$1" 2>&1 | \ | |
| openssl x509 -in /dev/stdin -text -noout -certopt \ | |
| no_header,no_version,no_serial,no_signame,no_validity,no_subject,no_issuer,no_pubkey,no_sigdump,no_aux 2>&1 | \ | |
| grep -o -P "DNS:.*" | sed 's/, /\n/g' | tr -d "DNS:" | |
| } | |
| alias tls-get-sans='ssl-get-sans' | |
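| # Example (illustrative): | |
| #   ssl-get-sans github.com   # one SAN per line, e.g. github.com, www.github.com | |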
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get TLS Supported Versions | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # Supported TLS versions | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| tls-get-supported-versions() { | |
| for ver in 1 1_1 1_2 1_3; do | |
| if printf "Q" | openssl s_client --connect "$1":443 -tls${ver} >/dev/null 2>&1; then | |
| [[ $ver == 1 ]] && ver='1_0' | |
| printf "%s\n" "TLS ${ver}: Supported" | |
| else | |
| [[ $ver == 1 ]] && ver='1_0' | |
| printf "%s\n" "TLS ${ver}: Not Supported" | |
| fi | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get TLS Expiry Date | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # Expiry date of the TLS certificate | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| tls-get-expiry() { | |
| local DOMAIN EXP_DATE_RAW EXP_DATE | |
| DOMAIN=$1 | |
| EXP_DATE_RAW=$( (echo -n Q \ | |
| | openssl s_client --servername "$DOMAIN" --connect "$DOMAIN":443 \ | |
| | openssl x509 --noout -dates) 2>&1 \ | |
| | grep notAfter | cut -d= -f2- | head -n1) | |
| EXP_DATE=$(date -d"$EXP_DATE_RAW" --utc +"%FT%T.%3NZ") | |
| if [[ "$EXP_DATE" < $(date --utc +"%FT%T.%3NZ") ]]; then | |
| printf 'Expiry date for %s is %s -- %b\n' "$DOMAIN" "$EXP_DATE" "\e[01;97;41mEXPIRED\e[0m" | |
| else | |
| printf 'Expiry date for %s is %s\n' "$DOMAIN" "$EXP_DATE" | |
| fi | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get TLS Chain | |
| # Arguments: | |
| # DOMAIN The domain to query | |
| # Outputs: | |
| # Full TLS certificate chain | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| tls-get-chain() { | |
| echo "Q" | openssl s_client -showcerts -connect "$1":443 | |
| } | |
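| # Example (illustrative); example.org is a placeholder domain: | |
| #   tls-get-supported-versions example.org | |
| #   tls-get-expiry example.org | |
| #   tls-get-chain example.org | grep -c 'BEGIN CERTIFICATE'   # count certs in the chain | |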
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # URL Profile | |
| # Notes: | |
| # - Uses digna, dnsTraceRedirects, tls-get-sans, tls-get-expiry, | |
| # tls-get-supported-versions, and Qualys SSL Test | |
| # Arguments: | |
| # URL(s) One or more URLs to profile | |
| # Outputs: | |
| # Profile report for each URL | |
| # Cyclomatic Complexity: 7 | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| url_profile() { | |
| # shellcheck disable=2048 | |
| for i in $*; do | |
| printHeading "$i" | |
| printf '%b\n' "\e[01;39mDNS Records:\e[0m" | |
| digna "$i" | |
| printf '%b\n' "\n\e[01;39mRedirects:\e[0m" | |
| dnsTraceRedirects "$i" | |
| printf '%b\n' "\n\e[01;39mTLS SANS:\e[0m" | |
| tls-get-sans "$i" | |
| printf '%b\n' "\n\e[01;39mTLS EXPIRY:\e[0m" | |
| tls-get-expiry "$i" | |
| printf '%b\n' "\n\e[01;39mTLS SUPPORTED VERSIONS:\e[0m" | |
| tls-get-supported-versions "$i" | |
| # Start the Qualys SSL Test report generation. | |
| curl -X GET "https://www.ssllabs.com/ssltest/analyze.html?d=${i}&hideResults=on&latest" > /dev/null 2>&1 | |
| printf '%b\n' "\n\e[01;39mQualys SSL Test:\e[0m" | |
| printf '%s\n' "https://www.ssllabs.com/ssltest/analyze.html?d=${i}&hideResults=on&latest" | |
| done | |
| } | |
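| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # GeoIP Lookup via ipstack | |
| # Notes: | |
| # - Requires an ipstack API access key; replace <yourkey> in the URL below. | |
| # - Requires 'jq'. | |
| # Arguments: | |
| # IP The IP address to look up | |
| # Outputs: | |
| # '<city>, <country_code>' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |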
| geoip_lookup() { | |
| curl -sS "http://api.ipstack.com/$1?access_key=<yourkey>&output=json&fields=country_code,city" | jq -r '"\(.city), \(.country_code)"' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Git | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Check SSH Authentication to Github | |
| # Notes: | |
| # - As a function to facilitate use with `while` | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| check_ssh_authentication_to_github() { | |
| TASK="Test SSH Authentication to GitHub"; et | |
| ssh -o "StrictHostKeyChecking accept-new" -T [email protected] 2>&1 | grep -q 'successfully authenticated' | |
| rc 0 | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Check SSH Authentication to CONFIG_REPO | |
| # Notes: | |
| # - As a function to facilitate use with `while` | |
| # Arguments: | |
| # $CONFIG_REPO | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| check_ssh_authentication_to_config_repo() { | |
| TASK="Test SSH Authentication to Private Config Repo"; et | |
| git ls-remote "$CONFIG_REPO" >/dev/null 2>&1 | |
| rc 0 | |
| } | |
| FORCE_JIRA_ID=false | |
| git-c-prefix() { | |
| # shellcheck disable=2001,2013 | |
| if [[ $FORCE_JIRA_ID == "true" ]]; then | |
| unset branch | |
| unset c_message | |
| unset branch_pass | |
| unset c_message_pass | |
| branch=$(git rev-parse --abbrev-ref HEAD) | |
| c_message=$* | |
| reg='[A-Z]{2,10}-[0-9]{1,7}' | |
| #c_prefix='DEVOPS-00: ' # Always insert a valid issue ID... | |
| [[ $branch =~ $reg ]] && branch_pass='true' | |
| if [[ $branch =~ $reg ]] && ! [[ $c_message =~ $reg ]]; then | |
| jira_id=$(sed 's/,$//' <<< "$(for i in $(grep -Eo "$reg" <<< "$branch"); do printf "%s" "$i,"; done)") | |
| c_message="${jira_id}: ${c_message}" | |
| fi | |
| [[ $c_message =~ $reg ]] && c_message_pass="true" | |
| if [[ $branch_pass != "true" ]] && [[ $c_message_pass != "true" ]] ; then | |
| loggerx ERROR "No Jira Issue ID Found!" | |
| read -rp "Enter Jira ID: " jira_id | |
| if [[ $jira_id =~ $reg ]]; then | |
| c_message="${jira_id}: ${c_message}" | |
| else | |
| loggerx ERROR "PEBCAK DETECTED! Quitting!" | |
| return 1 | |
| fi | |
| fi | |
| export c_message | |
| else | |
| c_message=$* | |
| export c_message | |
| fi | |
| } | |
| git_push_handler() { | |
| result=$(git push 2>&1) | |
| if grep -q "no upstream branch" <<< "$result" ; then | |
| cmd=$(tail -n 1 <<< "$result") | |
| cmd="${cmd#"${cmd%%[![:space:]]*}"}" | |
| loggerx WARNING "Pushing to new remote upstream" | |
| eval "$cmd" | |
| else | |
| echo "$result" | |
| fi | |
| } | |
| alias gitsuno='git status -uno' | |
| alias gitsu='git status -u' | |
| alias gits='git status .' | |
| gitrhard() { git reset --hard HEAD^; } | |
| gitrohard() { git reset --hard origin/"$(git rev-parse --abbrev-ref HEAD)"; } | |
| gitc() { git-c-prefix "$@" && git commit -m "$c_message"; } | |
| gitcp() { git-c-prefix "$@" && git commit -m "$c_message"; git_push_handler; } | |
| gitce() { git-c-prefix "$@" && git commit --allow-empty -m "$c_message"; } | |
| gitcep() { git-c-prefix "$@" && git commit --allow-empty -m "$c_message"; git_push_handler; } | |
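| # Example (illustrative); assumes FORCE_JIRA_ID=true and a branch named DEVOPS-123-fix: | |
| #   gitc "fix the widget"    # commits as "DEVOPS-123: fix the widget" | |
| #   gitcp "fix the widget"   # same, then pushes via git_push_handler | |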
| gitdb() { git branch -d "$1"; git push -d origin "$1"; } | |
| alias git-commit-tree='git log --graph --pretty=oneline --abbrev-commit' | |
| git-commit-grep() { git log --oneline | grep "$1" ;} | |
| #alias git-diff='git difftool -y -x sdiff HEAD^ $1 | pygmentize | less -R' | |
| git-diff() { | |
| git difftool -y -x sdiff HEAD^ "$1" | \ | |
| pygmentize | \ | |
| less -R | |
| } | |
| # shellcheck disable=2013,2046 | |
| git-search-file-history () { | |
| local file string | |
| if [[ $# -ne 2 ]]; then | |
| echo "ERROR: Must provide exatly two arguments: <file> <search_string>. Eg:"; | |
| echo " ${FUNCNAME[0]} 'path/to/file.txt' 'fooString'" | |
| return 1; | |
| fi; | |
| file=$1; | |
| string=$2; | |
| for i in $(cut -d: -f1 <<< "$(git grep "$string" $(git rev-list --all -- "${file}") -- "${file}")"); | |
| do | |
| git log -n1 "$i"; | |
| done | |
| } | |
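| # Example (illustrative); the file and search string are placeholders: | |
| #   git-search-file-history 'README.md' 'TODO' | |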
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get version of latest release. | |
| # Notes: | |
| # - This function strips leading '[vV]'. | |
| # Arguments: | |
| # $1 Github username/organization | |
| # $2 Repository name | |
| # Outputs: | |
| # Version of latest release. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| git-latest-release-version() { | |
| USER="$1" | |
| REPO="$2" | |
| local VERSION | |
| VERSION=$(curl -Ss "https://api.github.com/repos/${USER}/${REPO}/tags" \ | |
| | jq -r '.[].name' \ | |
| | grep -E "$REGEX_SEMVER" \ | |
| | head -n 1) | |
| [[ "$VERSION" =~ ^[vV]* ]] && VERSION=${VERSION//^[vV]/""/} | |
| echo "$VERSION" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get asset names of latest release. | |
| # Notes: | |
| # Simply reports the asset list as many projects diverge | |
| # from standard OS, ARCH inclusive asset naming schemes. | |
| # Implement selection downstream. | |
| # Arguments: | |
| # $1 Github username/organization | |
| # $2 Repository name | |
| # Outputs: | |
| # Asset names of the latest release. | |
| # Depends On: | |
| # - git-latest-release-version | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| git-latest-release-assets() { | |
| local USER="$1" | |
| local REPO="$2" | |
| local VERSION | |
| VERSION=$(git-latest-release-version "$USER" "$REPO") | |
| curl -Ss "https://api.github.com/repos/${USER}/${REPO}/releases" \ | |
| | jq -r --arg version "${VERSION}" '.[] | select (.tag_name == $version or .tag_name == "v\($version)") | .assets[].name' | |
| } | |
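| # Example (illustrative), using the github 'cli/cli' repository: | |
| #   git-latest-release-version cli cli   # e.g. '2.x.y' | |
| #   git-latest-release-assets cli cli    # lists the release asset file names | |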
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # This is a constant SHA across all repos used for various operations. | |
| # It will always return '4b825dc642cb6eb9a060e54bf8d69288fbee4904'. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias git_find_the_empty_tree='git hash-object -t tree /dev/null' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Clone a gist | |
| # Notes: | |
| # - Clones gist to ~/git/alexatkinson/gists_raw/<gist_id> | |
| # - Creates symbolic link ~/git/alexatkinson/gists/<link_name> | |
| # - Prompts for link name if not supplied | |
| # - Requires rc and et functions | |
| # Arguments: | |
| # <gist_id> [<link_name>] | |
| # Usage: | |
| # git_clone_gist <gist_id> [<link_name>] | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| git_clone_gist () { | |
| unset GIST_ID LINK_NAME OWD GIT_DIR GIST_DIR GIST_RAW_DIR RESP EXISTING_LINK | |
| local GIST_ID LINK_NAME OWD GIT_DIR GIST_DIR GIST_RAW_DIR RESP EXISTING_LINK | |
| [[ -z "$1" ]] && loggerx ERROR "Gist ID must be supplied!" && return 1 | |
| GIST_ID="$1" | |
| [[ -n "$2" ]] && LINK_NAME="$2" | |
| RESP="Y" | |
| OWD="$PWD" | |
| GIT_DIR="$HOME/git/alexatkinson" | |
| GIST_DIR="$GIT_DIR/gists" | |
| GIST_RAW_DIR="$GIT_DIR/gists_raw" | |
| TASK="CD to $GIST_RAW_DIR" | |
| cd "$GIST_RAW_DIR" || false ; rc 0 KILL | |
| if [[ -d "$GIST_RAW_DIR/$GIST_ID" ]]; then | |
| loggerx ERROR "Gist $GIST_ID already cloned." | |
| else | |
| TASK="Cloning Gist $GIST_ID" | |
| git clone "[email protected]:${GIST_ID}.git"; rc 0 KILL | |
| fi | |
| TASK="CD to $GIST_DIR" | |
| cd "$GIST_DIR" || false ; rc 0 KILL | |
| if EXISTING_LINK=$(find . -maxdepth 1 -type l -exec ls -la {} + | grep "$GIST_ID"); then | |
| loggerx ERROR "Symbolic link to $GIST_ID already exists." | |
| loggerx INFO "LINK: $(echo "$EXISTING_LINK" | awk '{print $9 " -> " $11}')" | |
| if ask "Create another symbolic link?" N ; then | |
| RESP="Y" | |
| else | |
| RESP="N" | |
| fi | |
| fi | |
| if [[ -z "$LINK_NAME" && "$RESP" =~ ^[Yy]$ ]]; then | |
| read -rp "Enter a name for the symbolic link: " LINK_NAME | |
| TASK="Create symbolic link $LINK_NAME -> $GIST_RAW_DIR/$GIST_ID" | |
| ln -s "$GIST_RAW_DIR/$GIST_ID" "$LINK_NAME"; rc 0 | |
| fi | |
| TASK="Return to $OWD" | |
| cd "$OWD" || false; rc 0 | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Docker | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get total pulls for a given docker image from DockerHub. | |
| # Arguments: | |
| # <image:tag> | |
| # Usage: | |
| # dockerhub-total-pulls <image:tag> | |
| # Example: | |
| # dockerhub-total-pulls debian:latest | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| dockerhub-total-pulls() { | |
| image=$(cut -d: -f1 <<< "$1") | |
| curl -s "https://hub.docker.com/v2/repositories/$image" | \ | |
| jq -r '(paths(scalars) | select(.[-1] == "pull_count")) as $p | [ ( [ $p[] | tostring ] | join(".") ) , ( getpath($p) | tojson ) ] | join(": ")' | \ | |
| awk '{s+=$2} END {print s}' | \ | |
| xargs printf "%'d" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get total pulls for multiple docker images from DockerHub. | |
| # Arguments: | |
| # <image:tag> [<image:tag> ...] | |
| # Usage: | |
| # dockerhub-total-pulls-report <image:tag> [<image:tag> ...] | |
| # Example: | |
| # dockerhub-total-pulls-report debian:latest ubuntu:latest fedora:latest archlinux:latest opensuse/leap:latest rockylinux:latest almalinux:latest | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2183,2068 | |
| dockerhub-total-pulls-report() { | |
| if [[ "$1" == "-h" ]]; then | |
| echo "Try: dockerhub-total-pulls-report debian:latest ubuntu:latest fedora:latest archlinux:latest opensuse/leap:latest rockylinux:latest almalinux:latest" | |
| return | |
| fi | |
| images=$* | |
| width=50 | |
| echo "TOTAL PULLS:" | |
| echo "------------" | |
| for image in ${images[@]}; do | |
| pulls=$(dockerhub-total-pulls "$image") | |
| printf "%s" "$image" | |
| printf "%*s" "$((COLUMNS-(COLUMNS-$(wc -c<<<"${image}${pulls}")+width)))" # | tr ' ' - | |
| printf "%s\n" "${pulls}" | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # List tags for a given docker image from DockerHub. | |
| # Arguments: | |
| # <image> | |
| # Usage: | |
| # dockerhub_tags <image> | |
| # Example: | |
| # dockerhub_tags debian | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| dockerhub_tags() { | |
| # List tags for a given docker image | |
| if [[ $# -lt 1 ]]; then | |
| loggerx ERROR image basename must be supplied! | |
| return 1 | |
| fi | |
| image=$1 | |
| tag_count=$(curl -sS "https://registry.hub.docker.com/v2/repositories/library/$image/tags" | jq -r '.count') | |
| total_pages=$(( "$tag_count" / 100 + 1)) | |
| page=1 | |
| while [ $page -le $total_pages ]; do | |
| curl -sS "https://registry.hub.docker.com/v2/repositories/library/$image/tags?page_size=100&page=$page" | jq -r '."results"[]["name"]' | |
| (( page++ )) | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # `docker images` wrapper | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| #alias di="docker images | grep -v '^<none>' | grep $1" | |
| di() { | |
| # A `docker images` wrapper | |
| [[ $# = 0 ]] && docker images | grep -v '^<none>' | |
| [[ $# = 0 ]] && return | |
| docker images | grep -v '^<none>' | grep "$1" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # `docker ps` wrapper | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias dps='docker ps --format "table {{.Names}}\t{{.Image}}\t{{.RunningFor}}\t{{.Status}}\t{{.Ports}}"' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # `watch docker ps` wrapper | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| dpsw() { | |
| watch -n2 'docker ps --format "table {{.Names}}\t{{.Image}}\t{{.RunningFor}}\t{{.Status}}\t{{.Ports}}"' | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Docker Aliases | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| alias dcont='docker container' | |
| alias dim='docker image' | |
| alias dvol='docker volume' | |
| alias dnet='docker network' | |
| alias dbuild='docker build -t' | |
| alias drun='docker run -it --rm' | |
| alias dlogs='docker logs -f' | |
| alias dexec='docker exec -it' | |
| alias dpsa='docker ps -a --format "table {{.Names}}\t{{.Image}}\t{{.Status}}\t{{.Ports}}"' | |
| alias dimgs='docker images --format "table {{.Repository}}\t{{.Tag}}\t{{.ID}}\t{{.Size}}"' | |
| alias dvols='docker volume ls' | |
| alias dnls='docker network ls' | |
| #alias drma='docker rm $(docker ps -a -q)' | |
| #alias drmi='docker rmi $(docker images -q)' | |
| #alias drmid='docker rmi $(docker images -f "dangling=true" -q)' | |
| alias dstop='docker stop $(docker ps -a -q)' | |
| alias dcli='docker container ls -a --format "table {{.Names}}\t{{.Image}}\t{{.Status}}\t{{.Ports}}"' | |
| alias dcleanw='docker container prune -f && docker image prune -f && docker volume prune -f && docker network prune -f' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Kubernetes / K8s / k8s | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=1090 | |
| command -v kubectl >/dev/null 2>&1 && source <(kubectl completion bash) # Enable kubectl bash completion | |
| alias k=kubectl | |
| alias kdelp="kubectl delete pod" | |
| alias kdesp="kubectl describe pod" | |
| alias kcd="kubectl create deployment" | |
| alias kgd="kubectl get deploy" | |
| alias kgs="kubectl get svc" | |
| alias kgp="kubectl get pod" | |
| alias wkgp="watch kubectl get pod" | |
| alias kgpw="kubectl get pod -o wide" | |
| alias kl="kubectl logs" | |
| alias kds="kubectl delete svc" | |
| alias ke="kubectl exec -it" | |
| alias kpfd="kubectl port-forward deploy" | |
| alias k="kubectl" | |
| alias kp="kubectl port-forward" | |
| alias kcfm="kubeconform" | |
| alias kcfms="kubeconform -summary -output json" | |
| kubectlns() { | |
| local CTX NS | |
| CTX=$(kubectl config current-context) | |
| NS="$1" | |
| NS=$(kubectl get namespace "$NS" --no-headers --output=go-template='{{.metadata.name}}' 2>/dev/null) | |
| if [ -z "$NS" ]; then | |
| loggerx WARNING "Namespace ($1) not found! Setting default namespace." | |
| NS="default" | |
| fi | |
| kubectl config set-context "$CTX" --namespace="$NS" | |
| } | |
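| # Example (illustrative); falls back to 'default' if the namespace does not exist: | |
| #   kubectlns kube-system | |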
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Ansible | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ansible-playbook wrapper to use local inventory.ini if | |
| # present. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| ansible-playbook() { | |
| [[ -f inventory.ini ]] && command ansible-playbook -i ./inventory.ini "$@" | |
| [[ ! -f inventory.ini ]] && command ansible-playbook "$@" | |
| } | |
| alias ap='ansible-playbook' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get Ansible Facts for a given host. | |
| # Arguments: | |
| # <host> The host to get facts for (optional, | |
| # defaults to localhost) | |
| # Outputs: | |
| # JSON formatted Ansible facts | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2120,2034 | |
| _ansible_facts() { | |
| local HOSTS | |
| [[ $1 == '-h' ]] && loggerx ERROR "Usage: _ansible_facts [<host>]" && return 1 | |
| [[ -z $1 ]] && HOSTS='localhost' || HOSTS="$1" | |
| ansible "$HOSTS" -m ansible.builtin.setup | sed '1c {' | jq . | |
| } | |
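| # Example (illustrative): | |
| #   _ansible_facts | jq -r '.ansible_facts.ansible_distribution'   # e.g. 'Ubuntu' | |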
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Update AWS CLI v2 | |
| # Notes: | |
| # - Works for Ubuntu. | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| awscli-update () { | |
| dir=$(mktemp -d) | |
| cd "$dir" || return | |
| curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" | |
| unzip awscliv2.zip | |
| ./aws/install --update | |
| version=$(aws/dist/aws --version | awk '{print $1}' | cut -d/ -f2) | |
| sudo rm /usr/local/aws-cli/v2/current/bin/aws | |
| sudo ln -s "/usr/local/aws-cli/v2/${version}/bin/aws" /usr/local/aws-cli/v2/current/bin/aws | |
| cd - || return | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Update AWS VPN Client | |
| # Notes: | |
| # - Works for Ubuntu. | |
| # - Requires libssl1.1 on Ubuntu 22.04 | |
| # See: https://blog.reinhard.codes/2023/11/09/using-the-aws-vpn-client-on-ubuntu-22-04/ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| awsvpn-update () { | |
| if [[ "$(lsb_release -ds)" =~ 22.04 ]]; then | |
| if [[ -z $(dpkg -S "libssl1.1" 2> /dev/null) ]]; then | |
| loggerx ERROR "libssl1.1 is required for awsvpn to operate in $(lsb_release -ds)! See here: https://blog.reinhard.codes/2023/11/09/using-the-aws-vpn-client-on-ubuntu-22-04/ ." | |
| return 1 | |
| fi | |
| fi | |
| dir=$(mktemp -d) | |
| cd "$dir" || return | |
| curl "https://d20adtppz83p9s.cloudfront.net/GTK/latest/awsvpnclient_amd64.deb" -o "awsvpnclient_amd64.deb" | |
| sudo dpkg -i awsvpnclient_amd64.deb | |
| cd - || return | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS Profile Helper - Get current IAM username | |
| # Notes: | |
| # - If no argument is supplied, the default profile is used. | |
| # Arguments: | |
| # - AWS Profile Name (optional) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # shellcheck disable=2120 | |
| aws-whoami() { | |
| if [[ "$1" == '-h' ]]; then | |
| loggerx ERROR "Supports zero or one argument. The argument must be a valid awscli profile name." | |
| return | |
| fi | |
| if [[ $# -eq 1 ]]; then | |
| aws --profile "$1" iam get-user --query User.UserName --output text | |
| return | |
| fi | |
| #aws iam get-user --query User.UserName --output text | |
| aws sts get-caller-identity --query "UserId" --output text | cut -d':' -f2 | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS Account ID Helper - Get current AWS Account ID | |
| # Notes: | |
| # - If no argument is supplied, the default profile is used. | |
| # Arguments: | |
| # - AWS Profile Name (optional) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| aws-account-id() { | |
| if [[ "$1" == '-h' ]]; then | |
| loggerx ERROR "Supports zero or one argument. The argument must be a valid awscli profile name." | |
| return | |
| fi | |
| if [[ $# -eq 1 ]]; then | |
| aws --profile "$1" sts get-caller-identity --query "Account" --output text | |
| return | |
| fi | |
| aws sts get-caller-identity --query "Account" --output text | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS Profile Helper - Get user tags for current IAM user | |
| # Notes: | |
| # - If no argument is supplied, the default profile is used. | |
| # Arguments: | |
| # - AWS Profile Name (optional) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| aws-get-my-tags() { | |
| if [[ "$1" == '-h' ]]; then | |
| loggerx ERROR "Supports zero or one argument. The argument must be a valid awscli profile name." | |
| return | |
| fi | |
| if [[ $# -eq 1 ]]; then | |
| aws --profile "$1" iam list-user-tags --user-name "$(aws-profile-whoami "$1")" | |
| return | |
| fi | |
| aws iam list-user-tags --user-name "$(aws-whoami)" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS ECR Login Helper | |
| # Arguments: | |
| # - region AWS region (eg: us-east-1) | |
| # - aws_profile_name AWS CLI profile name (optional) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| aws-ecr-login() { | |
| if [[ "$1" == '-h' ]]; then | |
| loggerx ERROR "Supports one or two arguments: region (REQUIRED), aws_profile_name." | |
| return | |
| fi | |
| if [[ $# -eq 1 ]]; then | |
| aws --region "$1" ecr get-login-password | docker login --username AWS --password-stdin "$(aws-account-id "$2").dkr.ecr.${1}.amazonaws.com" | |
| return | |
| fi | |
| if [[ $# -eq 2 ]]; then | |
| aws --region "$1" --profile "$2" ecr get-login-password | docker login --username AWS --password-stdin "$(aws-account-id "$2").dkr.ecr.${1}.amazonaws.com" | |
| return | |
| fi | |
| } | |
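| # Example (illustrative); the region and profile are placeholders: | |
| #   aws-ecr-login us-east-1        # default profile | |
| #   aws-ecr-login us-east-1 dev    # named profile | |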
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS Login Helper - Get temporary session tokens using MFA | |
| # Notes: | |
| # - Requires 'jq' to parse JSON response. | |
| # Arguments: | |
| # - aws_profile_name AWS CLI profile name | |
| # - token_code MFA token code | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| aws-profile-login() { | |
| if [[ $# -ne 2 ]]; then | |
| loggerx ERROR Exactly two arguments required: aws_profile_name token_code | |
| return 1 | |
| fi | |
| response=$(aws --profile "$1" sts get-session-token --serial-number "arn:aws:iam::$(aws-account-id "$1"):mfa/mfa" --token-code "$2") | |
| AWS_ACCESS_KEY_ID=$(jq -r .Credentials.AccessKeyId <<< "$response") | |
| AWS_SECRET_ACCESS_KEY=$(jq -r .Credentials.SecretAccessKey <<< "$response") | |
| AWS_SECURITY_TOKEN=$(jq -r .Credentials.SessionToken <<< "$response") | |
| export AWS_ACCESS_KEY_ID | |
| export AWS_SECRET_ACCESS_KEY | |
| export AWS_SECURITY_TOKEN | |
| } | |
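| # Example (illustrative); the profile and token code are placeholders: | |
| #   aws-profile-login dev 123456   # exports temporary credentials into the current shell | |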
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS Profile Helper - Run a command across multiple profiles | |
| # Notes: | |
| # - Assumes profiles named: devops, dev, stag, prod | |
| # Arguments: | |
| # - AWS CLI command and arguments | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| aws-profile-do-all() { | |
| for env in devops dev stag prod; do | |
| printHeading $env: "$@" | |
| aws --profile $env "$@" | |
| done | |
| } | |
| alias aws-profile-sso-login='aws sso login --profile' | |
| alias aws-profiles-list='aws configure list-profiles' | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # AWS List All Actions | |
| # Notes: | |
| # - No arguments | |
| # Outputs: | |
| # - List of all AWS IAM actions in the format: | |
| # SERVICE:ACTION | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| aws-list-all-actions() { | |
| curl --header 'Connection: keep-alive' \ | |
| --header 'Pragma: no-cache' \ | |
| --header 'Cache-Control: no-cache' \ | |
| --header 'Accept: */*' \ | |
| --header 'Referer: https://awspolicygen.s3.amazonaws.com/policygen.html' \ | |
| --header 'Accept-Language: en-US,en;q=0.9' \ | |
| --silent \ | |
| --compressed \ | |
| 'https://awspolicygen.s3.amazonaws.com/js/policies.js' | | |
| cut -d= -f2 | | |
| jq -r '.serviceMap[] | .StringPrefix as $prefix | .Actions[] | "\($prefix):\(.)"' | | |
| sort | | |
| uniq | |
| } | |
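| # Example (illustrative): | |
| #   aws-list-all-actions | grep '^s3:'   # all S3 actions | |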
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ██████╗ █████╗ ████████╗ █████╗ ██████╗ █████╗ ███████╗███████╗ | |
| # ██╔══██╗██╔══██╗╚══██╔══╝██╔══██╗██╔══██╗██╔══██╗██╔════╝██╔════╝ | |
| # ██║ ██║███████║ ██║ ███████║██████╔╝███████║███████╗█████╗ | |
| # ██║ ██║██╔══██║ ██║ ██╔══██║██╔══██╗██╔══██║╚════██║██╔══╝ | |
| # ██████╔╝██║ ██║ ██║ ██║ ██║██████╔╝██║ ██║███████║███████╗ | |
| # ╚═════╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═╝╚═════╝ ╚═╝ ╚═╝╚══════╝╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Memcached | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # RUN a local memcached with docker. Use either memcached or bitnami/memcached. | |
| # docker run --name memcached -p 11211:11211 memcached | |
| # docker run --name bitnami-memcached -p 11211:11211 bitnami/memcached | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Set a key in memcached | |
| # Arguments: | |
| # [KEY] [VAL] (TIMEOUT) (HOST) (PORT) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| memcached_set() { | |
| local HOST PORT T S K V | |
| if [[ $# -lt 2 ]]; then | |
| loggerx ERROR "At least 2 arguments required: ${FUNCNAME[0]} [KEY] [VAL] (TIMEOUT) (HOST) (PORT)" | |
| return 1 | |
| fi | |
| # key flags exptime bytes noreply(optional) value | |
| K=$1 | |
| V=$2 | |
| T=${3:-300} | |
| S=${#V} | |
| HOST=${4:-localhost} | |
| PORT=${5:-11211} | |
| printf '%b' "set $K 0 $T $S\r\n$V\r" | nc -q 0 "$HOST" "$PORT" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Set multiple junk items in memcached | |
| # Arguments: | |
| # (Number Of Junk Items To Make - DEFAULT: 5) (TIMEOUT) (HOST) (PORT) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| memcached_set_n_junk() { | |
| if [[ "$1" == "-h" ]]; then | |
| loggerx ERROR "No arguments required: ${FUNCNAME[0]} (Number Of Junk Items To Make - DEFAULT: 5) (TIMEOUT) (HOST) (PORT)" | |
| return 1 | |
| fi | |
| local NO_JUNK=${1:-5} | |
| local T=${2:-300} | |
| for i in $(seq -w 0001 "$NO_JUNK"); do | |
| echo "SETTING KEY: JUNK_$i" | |
| memcached_set JUNK_"$i" JUNK_VAL "$T" | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Replace a key in memcached | |
| # Arguments: | |
| # [KEY] [VAL] (TIMEOUT) (HOST) (PORT) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| memcached_replace() { | |
| local HOST PORT T S K V | |
| if [[ $# -lt 2 ]]; then | |
| loggerx ERROR "At least 2 arguments required: ${FUNCNAME[0]} [KEY] [VAL] (TIMEOUT) (HOST) (PORT)" | |
| return 1 | |
| fi | |
| K=$1 | |
| V=$2 | |
| T=${3:-300} | |
| S=${#V} | |
| HOST=${4:-localhost} | |
| PORT=${5:-11211} | |
| printf '%b' "replace $K 0 $T $S\r\n$V\r" | nc -q 0 "$HOST" "$PORT" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Get a key from memcached | |
| # Arguments: | |
| # [KEY] (HOST) (PORT) | |
| # Outputs: | |
| # KEY VAL | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| memcached_get() { | |
| local HOST PORT K IFS_BAK IFS r KEY VAL | |
| if [[ $# -lt 1 ]]; then | |
| loggerx ERROR "At least 1 argument required: ${FUNCNAME[0]} [KEY] (HOST) (PORT)" | |
| return 1 | |
| fi | |
| K=$1 | |
| HOST=${2:-localhost} | |
| PORT=${3:-11211} | |
| IFS_BAK=$IFS | |
| IFS=$'\r' | |
| r=$(echo get "$K" | nc -q 0 "$HOST" "$PORT") | |
| [[ $(wc -l <<< "$r") -lt 3 ]] && return 1 | |
| KEY=$(echo "$r" | head -n 1 | cut -d' ' -f2) | |
| VAL=$(echo "$r" | head -n -1 | tail -n -1) | |
| IFS=$IFS_BAK | |
| echo "$KEY $VAL" | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Delete a key from memcached | |
| # Arguments: | |
| # [KEY] (HOST) (PORT) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| memcached_delete() { | |
| local HOST PORT K | |
| if [[ $# -lt 1 ]]; then | |
| loggerx ERROR "At least 1 argument required: ${FUNCNAME[0]} [KEY] (HOST) (PORT)" | |
| return 1 | |
| fi | |
| K=$1 | |
| HOST=${2:-localhost} | |
| PORT=${3:-11211} | |
| echo delete "$K" | nc -q 0 "$HOST" "$PORT" | |
| } | |
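| # Example (illustrative) against a local memcached on the default port: | |
| #   memcached_set greeting hello 600 | |
| #   memcached_get greeting | |
| #   memcached_delete greeting | |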
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Flush all keys from memcached | |
| # Arguments: | |
| # (HOST) (PORT) | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| memcached_flush() { | |
| local HOST PORT | |
| if [[ "$1" == "-h" ]]; then | |
| loggerx ERROR "Flushes memcached. No arguments required: ${FUNCNAME[0]} (HOST) (PORT)" | |
| return 1 | |
| fi | |
| HOST=${1:-localhost} | |
| PORT=${2:-11211} | |
| echo "flush_all" | nc -q 0 "$HOST" "$PORT" | |
| } | |
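| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Dump all keys and values from memcached | |
| # Arguments: | |
| # (HOST) (PORT) | |
| # Outputs: | |
| # KEY VAL pairs, one per line | |
| # Depends On: | |
| # - memcached_get | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |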
| memcached_dump() { | |
| local HOST PORT | |
| # NOTE: stats cachedump doesn't return COLD items. | |
| if [[ "$1" == "-h" ]]; then | |
| loggerx ERROR "Flushes memcached. No arguments required: ${FUNCNAME[0]} (HOST) (PORT)" | |
| return 1 | |
| fi | |
| HOST=${1:-localhost} | |
| PORT=${2:-11211} | |
| for key in \ | |
| $(for slab_stat in \ | |
| $(echo "stats items" \ | |
| | nc -q 0 "$HOST" "$PORT" \ | |
| | grep ':number ' \ | |
| | cut -d':' -f2); do | |
| echo "lru_crawler metadump $slab_stat" \ | |
| | nc -q 0 "$HOST" "$PORT" \ | |
| | cut -d' ' -f1 \ | |
| | cut -d'=' -f2 \ | |
| | head -n -1 | |
| done); do | |
| memcached_get "$key" | |
| done | |
| } | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # ██████╗██╗ ██╗ ██████╗██╗ ██╗███████╗ █████╗ ████████╗███████╗ | |
| # ██╔════╝██║ ██║ ██╔════╝██║ ██║██╔════╝██╔══██╗╚══██╔══╝██╔════╝ | |
| # ██║ ██║ ██║ ██║ ███████║█████╗ ███████║ ██║ ███████╗ | |
| # ██║ ██║ ██║ ██║ ██╔══██║██╔══╝ ██╔══██║ ██║ ╚════██║ | |
| # ╚██████╗███████╗██║ ╚██████╗██║ ██║███████╗██║ ██║ ██║ ███████║ | |
| # ╚═════╝╚══════╝╚═╝ ╚═════╝╚═╝ ╚═╝╚══════╝╚═╝ ╚═╝ ╚═╝ ╚══════╝ | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # Just some reminders... | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| # _cli_cheats_network | |
| # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | |
| _cli_cheats_network() { | |
| cat <<EOF | |
| tls-foo <domain> - TLS Tools | |
| url_profile <url(s)> - Profile a URL | |
| nmcli device - List all nmcli devices | |
| nmcli device show <interface> - Show interface information | |
| geoip_lookup <ip_address> - Lookup geoip information for an IP address | |
| ss -antu | awk 'NR==1 || /LISTEN/' - List all listening network connections (ss) | |
| nmcli connection show - Show all nmcli connections | |
| ip monitor dev <DEVICE> - Monitor a network device for changes | |
| ip -br addr - Show brief IP address information | |
| EOF | |
| } | |
| #!/usr/bin/env bash | |
| # Iterate the VERSION metadata. | |
| # This simply uses the commit count as the version number. | |
| # This is fine for this use case. Similar to build number. | |
| # REF: https://gist.github.com/AlexAtkinson/7be00d6be71fab970210006b9574e1e5 | |
| # Operational notes: | |
| # - This hook does not create a new commit, it amends the last commit. | |
| # - This hook cannot reference its own commit message to avoid infinite loops. | |
| # - This compares the local VERSION metadata against the remote gist VERSION metadata. | |
| # - Due to the latency which may occur when pushing to remote repos, | |
| # a cadence file is used to limit how often this runs. | |
| # - This mitigates api rate limit issues. | |
| # - This also loop-breaks this hook from amending multiple times in quick succession. | |
| # - Don't forget that hooks operate in the root directory of the repo. | |
| __auto_version() { | |
| local LOCAL_VERSION LOCAL_FILE REMOTE_VERSION REMOTE_FILE_URL VERSION CADENCE_FILE | |
| # Don't run more than once every 2 seconds | |
| CADENCE_FILE=".git/hooks/.post-commit_autover-cadence.tmp" | |
| [[ ! -f "$CADENCE_FILE" ]] && touch "$CADENCE_FILE" | |
| if [[ $(( $(date +%s) - $(stat "$CADENCE_FILE" -c %Y) )) -le 2 ]]; then | |
| #for i in {1..5}; do | |
| # printf "\r%s" "Rate limit exceeded. Continuing in $(( 6 - $i ))..." | |
| # sleep 1 | |
| #done | |
| return | |
| fi | |
| touch "$CADENCE_FILE" | |
| VERSION=$(($(git rev-list --count --all)+1)) | |
| LOCAL_FILE=".bashrc" | |
| REMOTE_FILE_URL="https://gist.githubusercontent.com/AlexAtkinson/bc765a0c143ab2bba69a738955d90abd/raw/.bashrc" | |
| LOCAL_VERSION=$(grep -m1 '^# VERSION' "$LOCAL_FILE" | cut -d: -f2- | tr -d ' ') | |
| REMOTE_VERSION=$(curl -sS -r 0-400 "$REMOTE_FILE_URL" | grep -m1 '^# VERSION' | cut -d: -f2- | tr -d ' ') | |
| echo "Auto-version: (local: $LOCAL_VERSION, remote: $REMOTE_VERSION). Updating to $VERSION..." | |
| sed -i "s/^# VERSION : .*/# VERSION : $VERSION/" .bashrc | |
| git add .bashrc | |
| git commit --amend -C HEAD --no-verify | |
| } | |
| __auto_version |