### | |
PERSONAL COLLECTION OF MEMO | |
by https://github.com/niainaLens | |
### |
## DevOps | |
+ LPIC DevOps Tools Engineer | |
## Cloud | |
+ GCP | |
+ AWS | |
+ AZURE | |
## Linux | |
+ LPIC 1, 2, 3 | |
## Network | |
+ CISCO |
--- GCP | |
## Container | |
+ Kubernetes |Borg | |
+ Istio | |
Service Mesh | |
## Database | |
+ BigQuery | |
+ BigTable |Hbase | |
+ CloudSQL | |
+ CloudSpanner | |
+ MemoryStore | |
## Serverless | |
+ CloudFunction | |
--- AWS | |
--- AZURE |
A | |
*A/B Testing |Release management, Delivery, Marketing | |
delivery of two versions of a product in parallel to compare them
APM: Application Performance Monitoring |Monitoring, Elastic, AppDynamics, Dynatrace | |
*AT-LEAST ONCE: |Message delivery, Queue, Message broker
message will be delivered at least one time, but there may be duplicates | |
*AT-MOST ONCE: |Message delivery, Queue, Message broker | |
message is delivered at most one time, with the possibility of not being delivered at all | |
*ATOMIC DESIGN: reusable action & workflow |Development, DevOps, Design | |
B | |
BAFO: Best And Final Offer |Management, Project | |
BCP: Business Continuity Plan (*PCA) |Management, Agile | |
C | |
CAB: Change Advisory Board |Deployment | |
*CC-BY-NC-SA: Creative Commons licence (Attribution-NonCommercial-ShareAlike) |Licence
*Canary Release |Release Management, Delivery | |
delivery to restricted users before production | |
CDN: Content Delivery Network |Dev, Network, Infrastructure | |
*Circuit Breaker: |Resilience, Design pattern | |
automatically stops calls to a failing service to prevent cascading failures in a distributed system
CPU: Central Processing Unit |Machine | |
*CUTOVER PLAN: |Agile | |
action plan for a big change from an old technology to a new one (go-live, go-prod)
D | |
DAN: Do Anything Now |AI | |
*Dark Launch |Deployment, Release Management | |
release feature to limited users/environments | |
DoD: Definition Of Done |Agile, Scrum | |
DBT: Data Build Tools |DBT, Data, Analytics | |
DLQ: Dead Letter Queue |Queue | |
*Dilbert Principle (*Peter Principle): |Management | |
companies tend to promote incompetent employees to management to minimize their ability to harm productivity | |
DRP: Disaster Recovery Plan |Cloud | |
*DRY-RUN: |Testing, CI/CD | |
run without applying | |
*Dunbar Number: |Management, Agile | |
theoretical limit on the number of people with whom one can maintain stable social relationships at once
DX: Developer eXperience |Development | |
E | |
*EXACTLY ONCE: |Message delivery, Queue, Message broker | |
message is delivered exactly one time, without duplicates or omissions, then removed
*EXPONENTIAL BACKOFF: |Network, Algorithm | |
progressively increases the time between successive retries | |
F | |
*FACILITY MANAGEMENT: |Management, Agile | |
support function that ensures the functionality, comfort, safety, sustainability and efficiency of the built environment for the people who use it
FAIR: Findable, Accessible, Interoperable, Reusable |Data | |
*FALLBACK: |Security, DevOps | |
alternative option used when the primary plan is not successful/unavailable | |
FTT: Failures To Tolerate |Security, Cloud
G | |
*Gemba: |Management, Agile, Lean | |
process to Go, Look and See, how the work is done | |
GPO: Group Policy Object |Security
GPU: Graphical Processing Unit |Machine | |
H | |
HPC: High Performance Computing | |
*HUSHLOGIN: per-user file (~/.hushlogin) that suppresses the login banner/MOTD in Linux |Linux
I | |
*Ikigai |Personal Development | |
Japanese concept of a "reason for being", used in personal development
*Iron Triangle: |Project management | |
Project management triangle with triple constraints: Cost, Scope, Time
*ISO 26000: | |
standard for corporate social responsibility (CSR) |CSR
*ISO 27001: |Security | |
international standard for information systems security to define the Information Security Management System (ISMS) | |
ISMS: Information Security Management System |Security | |
I&A: Inspect & Adapt |Agile, SAFe | |
J | |
K | |
KB: Knowledge Base |Agile, Doc | |
KISS: Keep It Simple Stupid |Agile | |
L | |
*LIFT AND SHIFT (*REHOSTING/ REPLATFORMING): |Cloud, Migration | |
Move server to cloud as it is, without redesign | |
*LIVENESS PROBE |Kubernetes, Probe, Healthcheck | |
determines whether the application is running and responsive; readiness to accept traffic is checked separately (see Readiness Probe)
LUKS: Linux Unified Key Setup |Linux | |
M | |
MEAT: Most Economically Advantageous Tender |Management, Project | |
MTBF: Mean Time Between Failures |Infrastructure, Metric
MTTA: Mean Time To Acknowledge |Infrastructure, Metric | |
MTTF: Mean Time To Failure |Infrastructure, Metric | |
MTTR: Mean Time To Recovery |Infrastructure, Metric | |
MVP: Minimum Viable Product |Development, Agile | |
N | |
O | |
ORM: Object-Relational Mapping |Database | |
*OCKHAM'S RAZOR: |Agile, Principle, DevOps | |
less is better: eliminate unlikely hypotheses, parsimony principle
-> https://www.linkedin.com/posts/jeanbaptistemusso_rasoir-dockham-wikip%C3%A9dia-activity-7019270010028052480-txnE?utm_source=share&utm_medium=member_desktop | |
P | |
*PETER PRINCIPLE: (*Dilbert Principle) |Management | |
employees are promoted based on success until they attain their "level of incompetence" and are no longer successful | |
PCA: Plan de Continuité d'Activité (*BCP) |Management, Agile | |
PCI-DSS: Payment Card Industry Data Security Standard |Security, RGPD | |
PDCA: Plan Do Check Adjust |Agile | |
PoLP: Principle of Least Privilege |Security
*PORT-KNOCKING: |Network, Security, Firewall | |
open closed ports by sending network packets containing special information | |
PQQ: Pre-Qualification Questionnaire |Management, Project | |
Q | |
*QUORUM: |Management | |
minimum number of people needed to hold meetings or make decisions during certain company meetings | |
R | |
RACI: Responsible, Accountable, Consulted, Informed |Management | |
RAT: Riskiest Assumption Test |Development, Agile, Test | |
RCU: Référentiel Client Unique (*SCV) |Database, Marketing | |
*READINESS PROBE |Kubernetes, Probe, Healthcheck | |
determine if a container is ready to accept traffic and to be plugged to the system (after Liveness Probe OK) | |
*REHOSTING (*LIFT AND SHIFT/ REPLATFORMING): |Cloud, Migration | |
Move server to cloud as it is, without redesign | |
REPL: Read-Eval-Print Loop |Programming | |
*REPLATFORMING (*LIFT AND SHIFT/ REHOSTING): |Cloud, Migration | |
Move server to cloud as it is, without redesign | |
*RESILIENCE: |Security | |
ability to adapt to and absorb failures: one component can crash without taking the others down
RETEX: Retour d'Expérience (lessons learned / feedback from experience) |Management, Agile
RFI: Request For Information |Management, Project | |
RFP: Request For Proposal |Management, Project | |
RFQ: Request For Quotation |Management, Project | |
ROAM: Resolved Owned Accepted Mitigated |Management, Agile, SAFe, Risk management | |
RPO: Recovery Point Objective |Management, Agile | |
RTE: Release Train Engineer |Management, Agile, SAFe | |
RTO: Recovery Time Objective |Management, Agile | |
S | |
SAST: Static Application Security Testing |Security | |
SCIM: System for Cross-domain Identity Management |IAM, Identity, Cloud | |
SCV: Single Client View (*RCU) |Database, Marketing | |
SKU: Stock Keeping Unit |Azure, Cloud, Microsoft
SLA: Service Level Agreement |Agile, Project management, Support service | |
SLC: Simple Lovable Complete |Agile, Project management | |
*SLASHDOT EFFECT/ SLASHDOTTING |Security
temporary surge in traffic to a website | |
SLO: Service Level Objectives |Agile, Project management | |
SNS: Simple Notification Service |AWS | |
SSS: Shamir's Secret Sharing |Vault, Security, Encryption | |
T | |
*Tensor |Machine Learning | |
Multi-dimensional matrix | |
TDD: Test Driven Development |Agile, Principle, Dev | |
TPU: Tensor Processing Unit |Machine, Machine Learning | |
Two-SV: Two-Step Verification (*2-SV) |Security, Cloud
U | |
UX: User eXperience |Development
V | |
W | |
*Wideband Delphi |Management | |
Estimation method, diverge-converge | |
WAF: Web Application Firewall |Infrastructure | |
*Watermelon Metrics |Agile, Marketing | |
metrics that look green on the outside but are red inside: reporting looks healthy while the underlying reality is not
WSJF: Weighted Shortest Job First |Agile, SAFe | |
X | |
Y | |
YAGNI: You Ain’t Gonna Need It |Agile, Principle, DevOps | |
Z | |
*Zero Trust |Security | |
strict access controls and not trusting anyone by default, continuous validation on every stage of interaction | |
0 | |
1 | |
2 | |
2-SV: 2-Step Verification (Two-SV) |Security, Cloud
3 | |
4 | |
5 | |
6 | |
7 | |
8 | |
9 |
--- DOCKER | |
## docker in wsl/wsl2 | |
"failed to start daemon: Error initializing network controller ... iptable" | |
--> https://github.com/microsoft/WSL/issues/6655 | |
$ sudo update-alternatives --set iptables /usr/sbin/iptables-legacy | |
$ sudo update-alternatives --set ip6tables /usr/sbin/ip6tables-legacy | |
$ sudo dockerd & | |
or | |
$ sudo dockerd --debug --iptables=false & |
--> https://github.com/awesome-selfhosted/awesome-selfhosted | |
--> https://github.com/awesome-selfhosted/awesome-selfhosted/blob/master/non-free.md | |
==================== | |
== MIDDLEWARE == | |
==================== | |
--- ARCHITECTURE | |
- OPEX | |
--- INFRA AS CODE | |
## Configuration management | |
- ANSIBLE | |
- chef | |
- puppet | |
## Infrastructure build | |
- TERRAFORM | |
- Cloud Formation | |
## Image build | |
- PACKER | |
- DOCKER | |
## Containerization | |
- DOCKER | |
- Cri-o | |
## Container orchestrator | |
- DOCKER-COMPOSE | |
- KUBERNETES | |
--- DEV | |
## SCM | |
- Git | |
- Mercurial | |
## SCM GUI | |
- BitBucket |CI/CD | |
- Github |CI/CD | |
- GITLAB |CI/CD | |
## Code quality review | |
- SONARQUBE | |
- Selenium | |
--- CODE COLLABORATION/REVIEW | |
- GITLAB | |
- Gerrit | |
- Gitea | |
--- CI/CD | |
- GITLAB-CI | |
- Jenkins | |
- Rundeck | |
--- CMDB | |
- Git (YAML) | |
- Insight (JIRA) | |
--- SECURITY | |
## Vault | |
- ansible-vault | |
- VAULT (hashicorp) | |
## WAF | |
- Wordfence (wordpress plugin) | |
--- DATABASE | |
## RDBMS | |
- MySQL | |
- PostgreSQL | |
- OracleDB | |
- CLOUDSQL | |
## NoSQL | |
- MongoDB | |
## Database config management | |
- Liquibase | |
- SonarQube | |
## Time series | |
- influxDB | |
--- SECURITY | |
## Static application security testing (SAST) | |
- Gitleaks | |
--- TEST | |
## load/performance testing | |
- Apache JMeter | |
- NeoLoad | |
--- IDE/TOOLS | |
## FOR DOCKER | |
- Portainer | |
## FOR KUBERNETES | |
- Lens | |
## IDE | |
- VSCode | |
--- IDENTITY MANAGEMENT | |
- LDAP | |
- Keycloak | |
- Auth0 | |
- Authentik | |
--- PROJECT MANAGEMENT | |
## ITIL | |
- Redmine | |
## AGILE | |
- JIRA |Ticketing | |
--- TICKETING/ HELPDESK | |
- GLPI | |
- JIRA | |
--- MONITORING | |
## Cloud provider | |
- GCP: Stackdriver | |
- AWS: | |
- Azure: | |
## Container monitor | |
- cadvisor | |
- telegraf | |
## Data | |
- Datadog | |
- Splunk | |
## Observability | |
- Kibana | |
- Grafana | |
- Dynatrace | |
--- INVENTORY | |
- GLPI |Ticketing | |
- OCS Inventory | |
--- DOCUMENTATION | |
- Wiki | |
- DokuWiki | |
- Wiki.js | |
- Alfresco | |
- ReadTheDocs | |
/SaaS/ | |
- Confluence (JIRA) | |
- SharePoint | |
--- COMMUNICATION | |
- Slack | |
- Discord | |
- Teams | |
- Postfix | |
- Zimbra |Zimbra Cloud | |
--- OPEN/PROPRIETARY | |
## Container | |
Kubernetes --> Borg |GKE, EKS, AKS | |
## Database | |
HBase |GCP BigTable | |
==================== | |
== SAAS == | |
==================== | |
--- AGILE | |
- Klaxoon | |
- Gather town | |
--- Code collaboration | |
- GITLAB | |
- Github | |
- BitBucket | |
--- CI/CD | |
- GITLAB-CI | |
- Github Action | |
- Azure DevOps | |
--- CMDB | |
- Gitlab/Github | |
- Insight (JIRA) | |
--- IDENTITY MANAGEMENT | |
## Single-Sign-On | |
- Auth0 | |
- Office365 | |
--- REGISTRY | |
## Container/Image registry | |
- Amazon ECR (Elastic Container Registry) |AWS | |
- Azure Container Registry |Azure | |
- Dockerhub | |
- Google Artifact Registry |GCP | |
- ttl.sh | |
--- SHIPPING | |
- UPS | |
- Chronopost | |
- Mondial Relay | |
--- PAYMENT | |
- Paypal | |
- Payline | |
- Adyen | |
--- TAX | |
- Avalara | |
--- TICKETING | |
## ITIL | |
- ServiceNow | |
## AGILE | |
- JIRA | |
- Trello | |
--- DNS REGISTRAR | |
- Akamai | |
- Gandi | |
- GoDaddy | |
- LWS | |
## Mail service | |
- MS Office365 | |
- Google Workspace | |
- Zimbra Cloud | |
## Mail marketing | |
- Google Workspace | |
- SFMC (SalesForce Marketing Cloud) | |
- Mailjet | |
- Mailchimp | |
- Sendgrid | |
--- DOCUMENTATION | |
- Confluence (JIRA) | |
- SharePoint | |
- ReadTheDocs | |
--- MISC | |
- Bazaarvoice
ratings & reviews for e-commerce products
--- naming convention | |
TYPE_SOMETHING_NUMBER | |
ex: - a mailing list for group of mail | |
[email protected] | |
- a bucket in google storage | |
gs_PROJECT_LOCATION_NUMBER -> gs_project_ew1_1 | |
- a server compute engine | |
cmp_PROJECT_LOCATION_NUMBER -> cmp_project_ew1_1
## resources naming pattern | |
-> https://stepan.wtf/cloud-naming-convention/ | |
[prefix/org]-[resource_type]-[project]-[env]-[location]-[short_description]-[suffix/number] | |
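illustrative example of the pattern above (all values are placeholders):
org-vm-shop-prod-ew1-web-01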
--- MIGRATION | |
## cloud migration strategies | |
https://bluexp.netapp.com/blog/aws-cvo-blg-strategies-for-aws-migration-the-new-7th-r-explained | |
https://drive.google.com/file/d/16TcsxMpjL0Uda_IUj6UF0RS55Q9bW0XG/view?usp=sharing | |
+ Gartner’s 5 Rs Model | |
- Rehost (lift-and-shift) | |
Move apps to the cloud without major changes. | |
Use Case: Quick migration with minimal effort, suitable for apps compatible with cloud infrastructure (cloud native apps) | |
- Refactor (re-architecting) | |
Modify existing apps to better fit the cloud environment. | |
Use Case: Necessary for apps needing significant changes to leverage cloud benefits fully. | |
- Revise | |
Partially modify and optimize apps for the cloud, including some redesign efforts. | |
Use Case: apps requiring enhancements but not a complete overhaul. | |
- Rebuild | |
Recreate apps from scratch using cloud-native technologies. | |
Use Case: Ideal for achieving maximum performance, scalability, and agility using cloud-native features. | |
- Replace | |
Swap out existing apps with new, often SaaS-based, solutions. | |
Use Case: Best for outdated apps where a suitable cloud-based alternative exists. | |
+ AWS 6 Rs Model | |
- Rehosting (Lift-and-Shift) | |
Move apps to the cloud without significant changes. | |
Use Case: Quick and straightforward migration for compatible apps. | |
- Replatforming (Lift, Tinker, and Shift) | |
Make minor optimizations for the cloud without changing the core architecture. | |
Use Case: apps that can benefit from cloud services with minimal adjustments. | |
- Repurchasing (Drop-and-Shop) | |
Replace existing apps with cloud-based alternatives (SaaS). | |
Use Case: When suitable cloud solutions are available, reducing custom development needs. | |
- Refactoring / Re-architecting | |
Reimagine and modify apps to fully leverage cloud capabilities. | |
Use Case: apps needing significant changes to meet new business requirements. | |
- Retire | |
Shut down unnecessary apps. | |
Use Case: Legacy apps that are no longer useful, saving costs and resources. | |
- Retain (Revisit) | |
Keep certain apps on-premises due to specific needs or constraints. | |
Use Case: apps that must remain on-premises for various reasons, such as regulatory requirements. | |
+ AWS 7 Rs Model | |
- Rehosting (Lift-and-Shift) | |
Move apps to the cloud without significant changes. | |
Use Case: Quick and straightforward migration for compatible apps. | |
- Replatforming (Lift, Tinker, and Shift) | |
Make minor optimizations for the cloud without changing the core architecture. | |
Use Case: apps that can benefit from cloud services with minimal adjustments. | |
- Repurchasing (Drop-and-Shop) | |
Replace existing apps with cloud-based alternatives (SaaS). | |
Use Case: When suitable cloud solutions are available, reducing custom development needs. | |
- Refactoring / Re-architecting | |
Reimagine and modify apps to fully leverage cloud capabilities. | |
Use Case: apps needing significant changes to meet new business requirements. | |
- Retire | |
Shut down unnecessary apps. | |
Use Case: Legacy apps that are no longer useful, saving costs and resources. | |
- Retain (Revisit) | |
Keep certain apps on-premises due to specific needs or constraints. | |
Use Case: apps that must remain on-premises for various reasons, such as regulatory requirements. | |
- Relocate | |
Move entire data centers or large portions of infrastructure to the cloud with minimal changes, often involving virtual machines. | |
Use Case: When large-scale infrastructure migration is required without modifying individual apps. |
--- SYSTEM VARS | |
ARCH=$(dpkg --print-architecture) | |
KERNEL_NAME=$(uname -s) | |
KERNEL_RELEASE=$(uname -r) | |
LSB_RELEASE_OS=$(lsb_release -is) | |
LSB_RELEASE_VERSION=$(lsb_release -rs) | |
LSB_RELEASE_CODENAME=$(lsb_release -cs) | |
DIR_KEYRING="/etc/apt/keyrings" | |
DIR_GPG="/etc/apt/trusted.gpg.d" | |
--- APP | |
## dependencies | |
+ apt package | |
- apt-transport-https |code, gcloud, helm, kubectl, sublime-text | |
- bash-completion |bash, kubectl | |
- build-essential | |
- ca-certificates |docker, gcloud, kubectl
- curl |docker, kubectl, teams | |
- file | |
- git | |
- gpg |code | |
- gnupg |gcloud | |
- gnome-keyring |VSCode* (https://code.visualstudio.com/docs/editor/settings-sync#_troubleshooting-keychain-issues) | |
- gnupg |docker | |
- lsb-release |docker | |
- procps | |
- python3 |ansible | |
- python3-pip |ansible | |
- vim | |
- wget |code | |
+ pip package | |
- openai>=0.18.1 |codex-cli | |
- psutil>=5.9.0 |codex-cli | |
## package manager | |
+ apt key (root:root 644) | |
- ansible | |
$ apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 93C4A3FD7BB9C367 | |
- brave | |
$ curl -fsSLo ${DIR_KEYRING}/brave-browser.gpg https://brave-browser-apt-release.s3.brave.com/brave-browser-archive-keyring.gpg | |
- docker | |
$ curl -fsSL https://download.docker.com/linux/debian/gpg | sudo gpg --dearmor -o ${DIR_KEYRING}/docker.gpg | |
- hashicorp (terraform, vault, consul, packer) | |
$ wget -O- https://apt.releases.hashicorp.com/gpg | gpg --dearmor | sudo tee ${DIR_KEYRING}/hashicorp.gpg | |
- helm | |
$ curl https://baltocdn.com/helm/signing.asc | sudo gpg --dearmor -o ${DIR_KEYRING}/helm.gpg | |
- gcloud | |
$ | |
- google-cloud (gcloud, kubernetes) | |
$ curl -fsSLo ${DIR_KEYRING}/google-cloud.gpg https://packages.cloud.google.com/apt/doc/apt-key.gpg | |
- microsoft (code, teams) | |
$ curl -fsSL https://packages.microsoft.com/keys/microsoft.asc | sudo gpg --dearmor -o ${DIR_KEYRING}/packages.microsoft.gpg | |
- sublimetext | |
$ wget -qO - https://download.sublimetext.com/sublimehq-pub.gpg | sudo gpg --dearmor -o ${DIR_GPG}/sublimehq-archive.gpg | |
+ sources.list.d (*.list) | |
- ansible | |
deb http://ppa.launchpad.net/ansible/ansible/ubuntu focal main | |
- brave-browser | |
deb [signed-by=${DIR_KEYRING}/brave-browser.gpg] https://brave-browser-apt-release.s3.brave.com/ stable main | |
- chrome | |
deb [arch=${ARCH}] https://dl.google.com/linux/chrome/deb/ stable main | |
- code | |
deb [arch=${ARCH} signed-by=${DIR_KEYRING}/microsoft.gpg] http://packages.microsoft.com/repos/code stable main | |
- docker | |
deb [arch=${ARCH} signed-by=${DIR_KEYRING}/docker.gpg] https://download.docker.com/linux/debian ${LSB_RELEASE_CODENAME} stable | |
- hashicorp (terraform, packer, vault, consul) | |
deb [signed-by=${DIR_KEYRING}/hashicorp.gpg] https://apt.releases.hashicorp.com ${LSB_RELEASE_CODENAME} main | |
- helm | |
deb [arch=${ARCH} signed-by=${DIR_KEYRING}/helm.gpg] https://baltocdn.com/helm/stable/debian/ all main | |
- gcloud | |
deb [signed-by=${DIR_KEYRING}/google-cloud.gpg] https://packages.cloud.google.com/apt cloud-sdk main | |
- kubernetes | |
deb [signed-by=${DIR_KEYRING}/kubernetes.gpg] https://apt.kubernetes.io/ kubernetes-xenial main | |
- sublimetext | |
deb https://download.sublimetext.com/ apt/stable/ | |
- teams | |
deb [signed-by=${DIR_KEYRING}/microsoft.gpg] https://packages.microsoft.com/repos/ms-teams stable main | |
+ package | |
- ansible | |
- brave-browser | |
- code | |
- docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin | |
- docker-compose | |
- google-chrome-stable | |
- helm | |
- jq | |
- kubectl | |
- shellcheck | |
- snap | |
- sublimetext | |
- teams | |
- terraform | |
## deb | |
- wps | |
## snap | |
- skype | |
## git repo | |
- brew | |
$ /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)" | |
- codex-cli | |
https://github.com/microsoft/Codex-CLI/blob/main/Installation.md | |
- yq | |
$ wget https://github.com/mikefarah/yq/releases/latest/download/yq_linux_amd64 -O /usr/bin/yq &&\ | |
chmod +x /usr/bin/yq | |
--- CONFIG |
--- ACER ASPIRE 5 A514-55-51XE | |
OS: Debian 11 | |
Default kernel: 5.10 | |
DE: KDE Plasma | |
## ISSUES | |
+ with KDE Plasma DE, black screen at the startup login screen, no DE
+ log | |
$ dmesg --level=err | |
[ 0.337937] pci 0000:00:07.0: DPC: RP PIO log size 0 is invalid | |
[ 1.544938] integrity: Problem loading X.509 certificate -65 | |
[ 1.544948] integrity: Problem loading X.509 certificate -65 | |
[ 2.267560] i915 0000:00:02.0: firmware: failed to load i915/adlp_dmc_ver2_16.bin (-2) | |
[ 2.267563] firmware_class: See https://wiki.debian.org/Firmware for information about missing firmware | |
[ 2.267567] i915 0000:00:02.0: firmware: failed to load i915/adlp_dmc_ver2_16.bin (-2) | |
[ 2.277183] i915 0000:00:02.0: firmware: failed to load i915/adlp_guc_70.1.1.bin (-2) | |
[ 2.277189] i915 0000:00:02.0: firmware: failed to load i915/adlp_guc_70.1.1.bin (-2) | |
[ 2.277194] i915 0000:00:02.0: firmware: failed to load i915/adlp_guc_69.0.3.bin (-2) | |
[ 2.277199] i915 0000:00:02.0: firmware: failed to load i915/adlp_guc_69.0.3.bin (-2) | |
[ 2.277200] i915 0000:00:02.0: GuC firmware i915/adlp_guc_70.1.1.bin: fetch failed with error -2 | |
[ 2.368614] i915 0000:00:02.0: GuC initialization failed -2 | |
[ 2.368616] i915 0000:00:02.0: Enabling uc failed (-5) | |
[ 2.368617] i915 0000:00:02.0: Failed to initialize GPU, declaring it wedged! | |
[ 6.918799] uvcvideo 1-7:1.1: Failed to query (129) UVC probe control : 26 (exp. 48). | |
[ 6.918833] uvcvideo 1-7:1.1: Failed to initialize the device (-5). | |
[ 7.257953] bluetooth hci0: firmware: failed to load rtl_bt/rtl8852bu_fw.bin (-2) | |
[ 7.257990] bluetooth hci0: firmware: failed to load rtl_bt/rtl8852bu_fw.bin (-2) | |
[ 7.258014] Bluetooth: hci0: RTL: firmware file rtl_bt/rtl8852bu_fw.bin not found | |
[ 7.507427] sof-audio-pci-intel-tgl 0000:00:1f.3: firmware: failed to load intel/sof/sof-adl.ri (-2) | |
[ 7.507471] sof-audio-pci-intel-tgl 0000:00:1f.3: firmware: failed to load intel/sof/sof-adl.ri (-2) | |
[ 7.507506] sof-audio-pci-intel-tgl 0000:00:1f.3: error: sof firmware file is missing, you might need to | |
[ 7.507538] sof-audio-pci-intel-tgl 0000:00:1f.3: download it from https://github.com/thesofproject/sof-bin/ | |
[ 7.507557] sof-audio-pci-intel-tgl 0000:00:1f.3: error: failed to load DSP firmware -2 | |
[ 7.507965] sof-audio-pci-intel-tgl 0000:00:1f.3: error: sof_probe_work failed err: -2 | |
[ 11.372617] r8169 0000:2b:00.0: firmware: failed to load rtl_nic/rtl8168h-2.fw (-2) | |
[ 11.373357] r8169 0000:2b:00.0: firmware: failed to load rtl_nic/rtl8168h-2.fw (-2) | |
[ 11.373912] r8169 0000:2b:00.0: Unable to load firmware rtl_nic/rtl8168h-2.fw (-2) | |
## KERNEL UPGRADE | |
+ add backport repo to sources.list | |
deb http://deb.debian.org/debian bullseye-backports main contrib non-free | |
deb-src http://deb.debian.org/debian bullseye-backports main contrib non-free | |
+ upgrade kernel | |
$ apt update | |
$ apt-cache policy linux-image-amd64 | |
$ apt install linux-image-amd64=6.0.12-1~bpo11+1 linux-headers-6.0.0-0.deb11.6-amd64 | |
$ reboot | |
## INSTALL DEVICES DRIVERS | |
REQUIREMENTS | |
// install firmwares from Backports | |
$ apt install -t bullseye-backports firmware-linux firmware-linux-free firmware-linux-nonfree firmware-sof-signed firmware-intel-sound firmware-misc-nonfree firmware-realtek | |
// download latest firmware from kernel git repo | |
$ git clone https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git /opt/linux-firmware | |
(Backup existing firmware files if needed) | |
VGA (Intel Iris Xe Graphics) | |
--> Intel Corporation Device 46a8 (rev 0c) | |
--> i915 | |
$ mv /lib/firmware/i915 /lib/firmware/i915_ORIG | |
$ cp -avr /opt/linux-firmware/i915 /lib/firmware/ | |
BLUETOOTH (Realtek) | |
--> Realtek Bluetooth Radio | |
--> rtl_bt | |
$ mv /lib/firmware/rtl_bt /lib/firmware/rtl_bt__ORIG | |
$ cp -avr /opt/linux-firmware/rtl_bt /lib/firmware/ | |
AUDIO (Intel) | |
--> Intel Corporation Device 51c8 (rev 01) | |
--> SOF Project | |
// Download the SOF firmware project binaries from repo | |
$ wget https://github.com/thesofproject/sof-bin/releases/download/v2.2.4/sof-bin-v2.2.4.tar.gz -O - |tar -xz -C /opt/ | |
$ mv /lib/firmware/intel/sof /lib/firmware/intel/sof_ORIG | |
$ mv /lib/firmware/intel/sof-tplg /lib/firmware/intel/sof-tplg_ORIG | |
$ mv /lib/firmware/intel/sof-tplg-v1.7 /lib/firmware/intel/sof-tplg-v1.7_ORIG | |
$ cd /opt/sof-bin-v2.2.4 && ./install.sh v2.2.4 | |
ETHERNET (Realtek) | |
--> Realtek Semiconductor Co., Ltd. RTL8111/8168/8411 PCI Express Gigabit Ethernet Controller (rev 15) | |
--> r8168 | |
... NO ISSUE ... | |
WIFI (Realtek) | |
--> Realtek Semiconductor Co., Ltd. Device b852
--> rtw89 | |
// install requirements | |
$ apt install make gcc linux-headers-$(uname -r) build-essential git | |
// Download Realtek rtlwifi codes from project repo | |
- kernel <6.0 : https://github.com/HRex39/rtl8852be | |
- kernel >=6.0 : https://github.com/lwfinger/rtw89 | |
$ git clone https://github.com/lwfinger/rtw89.git /opt/rtw89 | |
$ cd /opt/rtw89 | |
$ make | |
$ make install | |
// copy new firmwares files from kernel repo | |
$ mv /lib/firmware/rtw89 /lib/firmware/rtw89_ORIG | |
$ cp -avr /opt/linux-firmware/rtw89 /lib/firmware/rtw89
// unload/reload kernel modules | |
$ modprobe -rv rtw_8852ae | |
$ modprobe -rv rtw_core | |
$ modprobe -v rtw_8852ae | |
THUNDERBOLT USB (Intel) | |
--> Intel Corporation Device 463e (rev 04) (thunderbolt) | |
--> thunderbolt | |
... NO ISSUE .. | |
WEBCAM (ACER) | |
--> ACER HD User Facing | |
### | |
### BIG PROBLEM | |
### | |
https://www.mail-archive.com/[email protected]/msg494663.html | |
// reload kernel config | |
$ update-initramfs -u -k all | |
$ reboot |
## ELK | |
+ apm + fleetserver |
## PAYMENT | |
- Stripe | |
- Adyen | |
## SHIPPING | |
- Chronopost | |
- UPS | |
## TAX | |
- Avalara | |
## MONITORING | |
+ Data | |
- Datadog | |
+ APM | |
- AppDynamics |
--- MISC | |
+ eBPF, Cilium | |
+ observability | |
--- DATA | |
+ DBT (Data Build Tools) | |
--- TERRAFORM | |
+ workspace | |
--- ANSIBLE | |
+ collection | |
--- GCP | |
+ service mesh | |
+ GKE Dataplane V2 | |
--- PROMETHEUS | |
+ node exporter |
I&A: Inspect & Adapt | |
MMP: Minimum Marketable Product | |
MVP: Minimum Viable Product | |
PDCA: Plan Do Check Adjust | |
ROAM: Resolved Owned Accepted Mitigated | |
RPO: Recovery Point Objective | |
RTE: Release Train Engineer | |
RTO: Recovery Time Objective | |
WSJF: Weighted Shortest Job First | |
## retro | |
KISS: Keep Improve Start Stop | |
## Story/stress Points | |
-> abstract estimation | |
+ take in account | |
- difficulty | |
- effort | |
- risks | |
- doubts/uncertainties
- inter-dependencies
+ how to use | |
- use fibonacci sequence: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233 | |
>8 -> to cut | |
- use t-shirt size: XS, S, M, L, XL, XXL | |
>XL -> to cut | |
- man-days: 1/2d, 1d, 2d, 3d, 4d, 5d, 6d
>5d -> to cut | |
## Planning Poker | |
-> product backlog refinement (grooming), take story points | |
## Grooming/refinement --> Product Backlog Refinement | |
-> review of team's backlog (backlog refinement) | |
## Extreme Quotation | |
## Burndown chart | |
## Cut-Over planning | |
Action plan for a big change from an old technology to a new one
--- MEETING | |
## DSM (<=15m) | |
- what you've done, what you are doing / will do, and whether anything is blocking you
- turn by turn | |
- no debate, no problem resolution | |
- issues/problems can be raised -> debated afterwards
(2h Ceremony) | |
## REVIEW | |
+ Sprint (<=1h20m) | |
## PLANNING | |
+ PI | |
+ Sprint (<=30m) | |
## RETRO | |
+ Sprint (<=10m) |
--- MISC --- | |
## suppress output of a task | |
- name: TASK
  module: PARAMS
  no_log: true
--- SPECIAL ENV VARS | |
ANSIBLE_VERBOSITY=[1-5] | |
ANSIBLE_DEBUG=[1|0] | |
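## usage example (a sketch; site.yml is a placeholder playbook)
$ ANSIBLE_DEBUG=1 ANSIBLE_VERBOSITY=4 ansible-playbook site.yml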
--- ISSUE --- | |
## SHORT-SHA misinterpreted
https://stackoverflow.com/questions/59066160/issue-deploying-to-appengine-flex-from-cloud-build | |
--- PLUGINS --- | |
## gcp plugin for ansible | |
--> inventory.gcp.yml | |
plugin: gcp_compute | |
projects: | |
- gcp_project | |
auth_kind: serviceaccount | |
service_account_file: '/path/to/service-account.json' | |
--> ansible.cfg | |
[inventory] | |
enable_plugins = gcp_compute | |
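## quick check of the dynamic inventory (sketch, assuming the two files above are in place)
$ ansible-inventory -i inventory.gcp.yml --graph
$ ansible-inventory -i inventory.gcp.yml --list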
--- BLOCK --- | |
- name: Multiple tasks in a block
  block:
    - name: 1st task
      module:
        param: value
    - name: 2nd task
      module:
        param: value
--- SERVICES ---- | |
- name: Get service facts & check if a service exists
  block:
    - name: Get services facts
      service_facts:
    - name: Check if my_service is installed
      fail:
        msg: my_service is not installed, install it first
      when: ansible_facts.services["my_service.service"] is not defined
--- COLLECTION --- | |
## install collection | |
$ ansible-galaxy collection install --collections-path ~/.ansible/collections ansible.posix | |
requirements.yml | |
``` | |
collections: | |
- community.general | |
- ansible.posix | |
``` | |
$ ansible-galaxy collection install --force --collections-path /root/.ansible/collections -r requirements.yml |
## disable directory listing | |
<Directory /var/www/public_html> | |
Options -Indexes | |
</Directory> | |
## force redirect non-www to https://www
RewriteEngine On | |
RewriteCond %{HTTP_HOST} !^www\. [NC] | |
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L] | |
## force redirect http|www|non-www to https|non-www | |
RewriteEngine On | |
RewriteCond %{SERVER_NAME} =DOMAIN.COM [OR] | |
RewriteCond %{SERVER_NAME} =WWW.DOMAIN.COM | |
RewriteRule ^ https://DOMAIN.COM%{REQUEST_URI} [END,NE,R=permanent] |
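## note: the rules above rely on mod_rewrite; on a Debian-style Apache it can be enabled with (sketch)
$ sudo a2enmod rewrite
$ sudo apachectl configtest && sudo systemctl reload apache2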
Domain Driven Design (DDD) | |
Loose coupling and API-driven architecture | |
Cloud-native architecture | |
Microservices and containerization | |
Serverless architecture | |
Strangler pattern |
## Move to cloud strategies | |
- Rehost/ Lift and Shift (IaaS) | |
rehost application in another hardware environment without changing architecture. | |
- Refactor (PaaS) | |
run apps on a cloud provider's PaaS, reusing existing code and frameworks
- Rebuild | |
throw out the code for an existing app and rearchitect it | |
- Replace (SaaS) | |
discard existing application set and adopt commercial software delivered as a service |
--- deployment | |
## deployment pattern | |
+ canary | |
+ dark launch | |
+ blue/green |
## reverse lookup | |
$ dig -x IP_ADDRESS |
--- BUILD --- | |
## build image | |
$ docker build -t IMAGE_NAME:TAG . | |
$ docker build -t IMAGE_NAME:TAG -f /PATH/TO/DOCKERFILE .
--- DOCKER-COMPOSE --- | |
## ERROR: Error creating container: UnixHTTPConnectionPool(host='localhost', port=None): Read timed out. (read timeout=60) | |
increase timeout | |
$ COMPOSE_HTTP_TIMEOUT=200 docker-compose up | |
--- FILE --- | |
## copy file | |
+ host to container | |
$ docker cp FILE_ON_HOST CONTAINER_ID:FILE | |
+ container to host | |
$ docker cp CONTAINER_ID:FILE FILE_ON_HOST | |
--- IMAGE --- | |
## rename image | |
$ docker tag CURRENT_IMAGE_NAME:TAG NEW_IMAGE_NAME:TAG | |
--- RUN --- | |
## run container | |
$ docker run --name CONTAINER_NAME -p LOCAL_PORT:CONTAINER_PORT -e ENV_VAR=VALUE IMAGE:TAG |
--- pure-ftpd --- | |
## Change password | |
$ sudo pure-pw passwd FTP_USER | |
$ sudo pure-pw mkdb |
--- HISTORY --- | |
Basic principle: | |
- collect data | |
- store data
- process data with compute power | |
Main types of resources:
- network | |
- storage | |
- compute engine | |
## Google technology timeline | |
2002: GFS (Filesystem/ OpenSource: HDFS) | |
2004: MapReduce (distributed data processing over the datacenter)
2006: BigTable (NoSQL/ OpenSource: HBase) | |
2006: Borg (Container orchestrator/ OpenSource: Kubernetes) | |
2008: Dremel (Datawarehouse/ Origin of BigQuery) | |
2010: Colossus | |
2012: Flumejava | |
2013: Spanner | |
2015: Kubernetes | |
2016: Tensorflow | |
## GCP products categories | |
+ Large public, devices & productivity/collaboration/communication | |
-> Gmail, Drive, Android | |
+ Legacy integrations (expose API, connect business platform) | |
-> front app, apigee, vision/language API | |
+ Core apps (accelerate app delivery) | |
-> app dev & management, AppEngine, ContainerEngine | |
+ Data in silos (EDW, Hadoop) | |
-> Data analysis, ML, BigQuery, Dataflow | |
+ Private data center (Virtualization) (full managed operations) | |
-> infrastructure, compute/container, storage, network | |
## Cloud computing services | |
--- | |
+ Compute | |
- Compute Engine |Virtual server | |
- Kubernetes Engine |Container orchestrator cluster | |
- Cloud Run |Container serverless | |
- App Engine |Application runner | |
- Cloud Functions |Simple code runner | |
+ Networking | |
- Cloud Virtual Network | |
- Cloud interconnect | |
- Cloud DNS | |
- Load Balancing | |
- Cloud CDN | |
+ Big Data | |
- BigQuery |store & query large datasets (data warehouse)
- Cloud Pub/Sub |async messaging bus, ~Kafka | |
- Cloud Dataflow |transform large data | |
- Cloud Dataproc |managed Hadoop/Spark
- Cloud Datalab |interactive notebooks for data exploration
+ Machine Learning | |
- Cloud Machine Learning |TPU | |
- Vision API | |
- Speech API | |
- Translate API | |
- Natural Language API | |
+ Storage | |
- Cloud Storage |large object storage | |
- Cloud SQL |relational database | |
- Cloud Spanner |globally distributed/scalable sql/transaction db | |
- Cloud Datastore |NoSQL database | |
- Cloud Bigtable |high-performance NoSQL database (for ML) | |
+ Database | |
+ Analytics | |
+ AI | |
---- | |
+ Security | |
+ Tooling | |
+ API | |
--- MISC --- | |
## filter | |
Get only a value | |
-> internal IP | |
$ gcloud compute instances list --filter="name~'${NAME}' status=RUNNING" --format="value(networkInterfaces[0].networkIP)" | |
-> public IP | |
$ gcloud compute instances list --filter="name~'${NAME}' status=RUNNING" --format="value(networkInterfaces[0].accessConfigs[0].natIP)" | |
In a formatted table (box) with sort | |
$ gcloud compute instances list --format="table[box](name,creationTimestamp:sort=1,status,metadata.items.version,metadata.items.sha1,INTERNAL_IP)" --filter="name~'hylo' status=RUNNING" | |
Get according to datetime (creationTimestamp) | |
--> https://cloud.google.com/sdk/gcloud/reference/topic/datetimes | |
$ gcloud *** list --filter="creationTimestamp<=-pXyYmZdAm" | |
-P: duration in the past (older than)
+P: duration in the future
## get startupscript log via instance serial port | |
$ gcloud compute instances get-serial-port-output INSTANCE --zone=ZONE |awk '/GCEMetadataScripts/{print substr($0, index($0, $9))}' | |
--- ENVIRONMENT VARIABLE | |
Region: $REGION | |
Project ID: $PROJECT_ID | |
Project number: $PROJECT_NUMBER | |
Zone: $ZONE | |
Service Account: $SERVICE_ACCOUNT | |
Service Account mail: $SERVICE_ACCOUNT_EMAIL | |
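## populating some of these from the active gcloud config (sketch; values depend on your configuration)
$ export PROJECT_ID=$(gcloud config get-value project)
$ export PROJECT_NUMBER=$(gcloud projects describe ${PROJECT_ID} --format='value(projectNumber)')
$ export REGION=$(gcloud config get-value compute/region)
$ export ZONE=$(gcloud config get-value compute/zone)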
--- PROJECT --- | |
## usage | |
+ Global resources collection/organization
- track resource/quota usage | |
- enable/disable service/API | |
- control permissions/credentials | |
- set billing account (can be associated with one/more projects) | |
- any service is associated with one and only one project
+ isolation between resources
- explicit trust across project can be created | |
## GCP resources hierarchy levels
--> https://cloud.google.com/blog/products/gcp/mapping-your-organization-with-the-Google-Cloud-Platform-resource-hierarchy | |
[ORG] | |
{Company} | |
[FOLDERS] | |
(Dept1) (Dept2) (SharedInfra) | |
<TeamA> <TeamB> | |
"Product1" "Product2" | |
[PROJECTS] | |
'ProjectA' 'Project2' | |
[RESOURCES]
-VM -storage | |
## set default region/zone | |
$ gcloud compute project-info add-metadata --metadata google-compute-default-region=REGION,google-compute-default-zone=ZONE | |
--- IAM --- | |
## get role/permissions of a service account | |
$ gcloud projects get-iam-policy PROJECT_ID --flatten="bindings[].members" --format='table(bindings.role)' --filter="bindings.members:SERVICE_ACCOUNT_EMAIL" | |
$ gcloud asset search-all-iam-policies --query='policy:"serviceAccount:SERVICE_ACCOUNT_EMAIL"' | |
## get Cloud Storage service agent account | |
$ gcloud storage service-agent --project=PROJECT_ID | |
or | |
$ gsutil kms serviceaccount -p PROJECT_ID | |
## List of all default Service Accounts | |
- App Engine: [email protected] | |
- Compute Engine: [email protected] | |
- Cloud Functions: [email protected] | |
- Cloud Run (fully managed): [email protected] | |
- Cloud Build: [email protected] | |
- Google Kubernetes Engine (GKE): [email protected] | |
- AI Platform: [email protected] | |
- Dataprep: [email protected] | |
- Dataflow: PROJECT_NUMBER@dataflow-service-producer-prod.iam.gserviceaccount.com | |
- BigQuery Data Transfer: [email protected] | |
- Pub/Sub: [email protected] | |
--- SECRET MANAGER --- | |
## show secret | |
$ gcloud secrets versions access latest --secret SECRET_NAME --project PROJECT_NAME | |
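## create a secret & add a version (sketch; SECRET_NAME and VALUE are placeholders)
$ gcloud secrets create SECRET_NAME --replication-policy=automatic --project PROJECT_NAME
$ echo -n "VALUE" | gcloud secrets versions add SECRET_NAME --data-file=- --project PROJECT_NAME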
--- GOOGLE STORAGE --- | |
## mount bucket to instance dir | |
$ gcsfuse -o allow_other,nonempty --file-mode 755 --dir-mode 755 --gid XYZ --uid XYZ --key-file /PATH/TO/KEY.json BUCKET_NAME /MOUNT/POINT | |
in fstab: | |
BUCKET_NAME /MOUNT/POINT gcsfuse rw,noauto,user,key_file=/PATH/TO/KEY.json
unmount: | |
$ fusermount -u /MOUNT/POINT | |
or | |
$ sudo umount /MOUNT/POINT |
--- MISC --- | |
## execute dry-run command | |
$ git cmd --dry-run | |
## remove tracked dir/file on remote but keep on local | |
$ git rm --cached <file-name> | |
$ git rm -r --cached <folder-name>
## prevent git from detecting changes in some files | |
$ git update-index --assume-unchanged FILE
## get branch name via cmd | |
$ LONG_SHA=$(git rev-parse HEAD) | |
$ BRANCH_NAME=$(sed -nE "/$LONG_SHA/s/.*refs\/heads\/(.*)/\1/p" <<< "$(git ls-remote -q)") | |
## set remote URL | |
SSH | |
$ git remote set-url origin git@DOMAIN:PROJECT/REPOSITORY.git | |
HTTPS | |
$ git remote set-url origin https://DOMAIN/PROJECT/REPOSITORY.git | |
## set pre-commit | |
-> https://github.com/pre-commit/pre-commit | |
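## typical setup (sketch; assumes a .pre-commit-config.yaml at the repo root)
$ pip install pre-commit
$ pre-commit install
$ pre-commit run --all-files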
## BISECT: debug commit by commit | |
$ git bisect start | |
if bad | |
$ git bisect bad | |
if good | |
$ git bisect good | |
$ git bisect run TEST_CMD
$ git bisect log | |
$ git bisect skip | |
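## example session (sketch; ./test.sh is a placeholder script that exits 0 on good commits)
$ git bisect start HEAD KNOWN_GOOD_SHA
$ git bisect run ./test.sh
$ git bisect reset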
--- ISSUE | |
## Gitlab Token changed, reload local repo with new token | |
"remote: HTTP Basic: Access denied. The provided password or token is incorrect or your account has 2FA enabled and you must use a personal access token instead of a password. See https://gitlabee.dt.renault.com/help/topics/git/troubleshooting_git#error-on-git-fetch-http-basic-access-denied | |
fatal: Authentication failed for..." | |
$ git remote set-url origin https://USER:GITLAB_TOKEN@GITLAB_URL/REPO.git | |
--- CONFIG --- | |
## enable color | |
$ git config --global color.ui true | |
--- COMMIT --- | |
## verify and select commits | |
$ git add -p | |
## undo git add | |
$ git reset | |
## amend the previous commit
Commit without edit message | |
$ git commit --amend --no-edit | |
Change commit message in previous commits | |
$ git rebase -i HEAD~n | |
--> reword SHA1 | |
$ git commit --amend -m "NEW MESSAGE"
## change author of commit | |
Set the correct author for current Git repo then | |
$ git rebase SRC_BRANCH --exec "git commit --amend --reset-author --no-edit" | |
or | |
$ git rebase --onto HEAD~N --exec "git commit --amend --reset-author --no-edit" HEAD~N | |
## revert file to a state from other branch | |
$ git checkout BRANCH -- PATH/TO/FILE | |
## get long sha1 of head | |
$ git rev-parse HEAD | |
## look for branches containing a commit | |
$ git branch (--remote) --contains COMMIT_SHA | |
## commit format | |
+ Default format | |
TYPE(SCOPE): SUBJECT | |
OPTIONAL BODY | |
OPTIONAL FOOTER | |
+ Merge commit | |
Merge branch 'BRANCH_NAME' | |
+ Type | |
- feat: adds a new feature | |
- fix: fixes a bug | |
- refactor: rewrite/restructure code, however does not change any behaviour | |
- perf: special refactor commits, that improve performance | |
- style: do not affect the meaning (white-space, formatting, missing semi-colons, etc) | |
- test: add missing tests or correcting existing tests | |
- docs: affect documentation only | |
- build: affect build components like build tool, ci pipeline, dependencies, project version, ... | |
- ops: affect operational components like infrastructure, deployment, backup, recovery, ... | |
- chore: miscellaneous commits e.g. modifying .gitignore | |
+ Breaking Changes Indicator "!" | |
TYPE(SCOPE)!: SUBJECT | |
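+ Examples (illustrative)
feat(auth): add OAuth2 login
feat(api)!: remove the deprecated v1 endpoints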
--- TAG --- | |
$ git tag TAG BRANCH | |
$ git push origin TAG | |
## merge tag to branch | |
$ git checkout BRANCH | |
$ git merge vTAG | |
--- FIXUP --- | |
## fixup anterior commit | |
$ git commit --fixup=SHA1 | |
$ git rebase -i --autosquash SHA1~1 | |
--- BRANCH --- | |
## create branch from another branch | |
$ git checkout branch_orig | |
$ git checkout -b feature/branch_dest branch_orig | |
## diff/compare branches | |
$ git diff BRANCH1..BRANCH2
## rename branch | |
$ git branch -m NEW_NAME | |
$ git push origin :OLD_NAME NEW_NAME | |
$ git push origin -u NEW_NAME | |
--- REBASE --- | |
## rebase feat branch from an origin branch | |
$ git rebase -i master | |
--- RESTORE --- | |
## restore file to a commit | |
$ git checkout COMMID_ID -- FILE | |
## restore file to a branch | |
$ git restore --source origin/BRANCH FILE | |
--- CHERRY-PICK --- | |
## cherry pick commit from other branch | |
$ git checkout TARGET_BRANCH | |
$ git cherry-pick SHA1 | |
--- LOG --- | |
## log of single file | |
$ git log --follow -p PATH/TO/FILE
## log overview of all commits and changes
$ git log --patch | |
## REFLOG (reference log): logs updates to refs (branch tips, HEAD)
$ git reflog show BRANCH
--- MERGE --- | |
## manual merge local/origin branch to master | |
Fetch and check out the branch (create it on local if not exist) | |
$ git fetch origin | |
$ git checkout -b "feat/MY_BRNACH" "origin/feat/MY_BRANCH" | |
Review the changes locally | |
Merge the branch and fix any conflicts that come up | |
$ git fetch origin | |
$ git checkout "master" | |
$ git merge --no-ff "feat/MY_BRANCH" | |
Push | |
$ git push origin "master" | |
--- RESET --- | |
## reset to origin branch | |
$ git reset --hard origin/BRANCH | |
--- STASH ---- | |
## stash workflow | |
Save | |
$ git stash save | |
or | |
git stash save "WIP MESSAGE" | |
Include untracked files | |
$ git stash save --include-untracked | |
List | |
$ git stash list | |
Show desc | |
$ git stash show stash@{0} | |
Get stashed work (and delete ref) | |
$ git stash pop | |
or | |
$ git stash pop stash@{0} | |
Get but don't delete ref | |
$ git stash apply | |
--- CLEAN --- | |
## clean repo directory, remove untracked files | |
$ git clean | |
-n: dry-run | |
-f: force | |
-i: interactive | |
-d: directory+files | |
------ GITLAB-CI ------ | |
## scheduled job | |
--- | |
JOB:on-schedule:
  stage: STAGE
  variables:
    VARS: VALUE
  script:
    - CMD
  only:
    - schedules
--- | |
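## the same trigger with the newer rules: syntax (sketch)
---
JOB:on-schedule:
  stage: STAGE
  script:
    - CMD
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
---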
## templating block
--- | |
.TEMPLATE_NAME:
  block:
    key: value
JOB:
  extends:
    - .TEMPLATE_NAME
--- |
## 1xx INFORMATIONAL | |
100 - Continue - Client should continue with request. | |
101 - Switching Protocols - Server is switching protocols. | |
102 - Processing - Server has received and is processing the request. | |
103 - Early Hints - Server is likely to send a final response with the header fields included in this informational response (used to preload resources).
122 - Request-uri too long - URI is longer than a maximum of 2083 characters. | |
## 2xx SUCCESS | |
200 - Ok - The request was fulfilled. | |
201 - Created - Following a POST command, this indicates success, but the textual part of the response line indicates the URI by which the newly created document should be known. | |
202 - Accepted - The request has been accepted for processing, but the processing has not been completed. The request may or may not eventually be acted upon, as it may be disallowed when processing actually takes place. there is no facility for status returns from asynchronous operations such as this. | |
203 - Partial Information - When received in the response to a GET command, this indicates that the returned metainformation is not a definitive set of the object from a server with a copy of the object, but is from a private overlaid web. This may include annotation information about the object, for example. | |
204 - No Response - Server has received the request but there is no information to send back, and the client should stay in the same document view. This is mainly to allow input for scripts without changing the document at the same time. | |
205 - Reset Content - Request processed, no content returned, reset document view. | |
206 - Partial Content - partial resource return due to request header. | |
207 - Multi-Status - XML, can contain multiple separate responses. | |
208 - Already Reported - results previously returned. | |
226 - IM Used - Request fulfilled; the response is the result of one or more instance manipulations applied to the current instance.
## 3xx REDIRECTION | |
301 - Moved Permanently - The data requested has been assigned a new URI, the change is permanent. (N.B. this is an optimisation, which must, pragmatically, be included in this definition. Browsers with link editing capability should automatically relink to the new reference, where possible)
302 - Found - The data requested actually resides under a different URL, however, the redirection may be altered on occasion (when making links to these kinds of document, the browser should default to using the URI of the redirection document, but have the option of linking to the final document) as for "Forward".
303 - See Other - The response can be found under a different URI and should be retrieved using GET, regardless of the original request method.
304 - Not Modified - If the client has done a conditional GET and access is allowed, but the document has not been modified since the date and time specified in If-Modified-Since field, the server responds with a 304 status code and does not send the document body to the client. | |
305 - Use Proxy - Content located elsewhere, retrieve from there. | |
306 - Switch Proxy - Subsequent requests should use the specified proxy. | |
307 - Temporary Redirect - Connect again to different URI as provided. | |
308 - Permanent Redirect - Connect again to a different URI using the same method. | |
## 4xx CLIENT SIDE ERRORS | |
400 - Bad Request - The request had bad syntax or was inherently impossible to be satisfied. | |
401 - Unauthorized - The parameter to this message gives a specification of authorization schemes which are acceptable. The client should retry the request with a suitable Authorization header. | |
402 - Payment Required - The parameter to this message gives a specification of charging schemes acceptable. The client may retry the request with a suitable ChargeTo header. | |
403 - Forbidden - The request is for something forbidden. Authorization will not help. | |
404 - Not Found - The server has not found anything matching the URI given. | |
405 - Method Not Allowed - Request method not supported by that resource. | |
406 - Not Acceptable - Content not acceptable according to the Accept headers. | |
407 - Proxy Authentication Required - Client must first authenticate itself with the proxy. | |
408 - Request Timeout - Server timed out waiting for the request. | |
409 - Conflict - Request could not be processed because of conflict. | |
410 - Gone - Resource is no longer available and will not be available again. | |
411 - Length Required - Request did not specify the length of its content. | |
412 - Precondition Failed - Server does not meet request preconditions. | |
413 - Request Entity Too Large - Request is larger than the server is willing or able to process. | |
414 - Request URI Too Large - URI provided was too long for the server to process. | |
415 - Unsupported Media Type - Server does not support media type. | |
416 - Requested Range Not Satisfiable - Client has asked for unprovidable portion of the file.
417 - Expectation Failed - Server cannot meet requirements of Expect request-header field. | |
418 - I'm a teapot - I'm a teapot. | |
420 - Enhance Your Calm - Twitter rate limiting. | |
421 - Misdirected Request - Server is not able to produce a response. | |
422 - Unprocessable Entity - Request unable to be followed due to semantic errors. | |
423 - Locked - Resource that is being accessed is locked. | |
424 - Failed Dependency - Request failed due to failure of a previous request. | |
426 - Upgrade Required - Client should switch to a different protocol. | |
428 - Precondition Required - Origin server requires the request to be conditional. | |
429 - Too Many Requests - User has sent too many requests in a given amount of time. | |
431 - Request Header Fields Too Large - Server is unwilling to process the request. | |
444 - No Response - Server returns no information and closes the connection. | |
449 - Retry With - Request should be retried after performing action. | |
450 - Blocked By Windows Parental Controls - Windows Parental Controls blocking access to webpage. | |
451 - Unavailable For Legal Reasons - Access to the resource is denied for legal reasons (also used by Exchange: wrong server, cannot reach the client's mailbox).
499 - Client Closed Request - Connection closed by client while HTTP server is processing. | |
## 5xx SERVER SIDE ERROR | |
500 - Internal Server Error - The server encountered an unexpected condition which prevented it from fulfilling the request.
501 - Not Implemented - The server does not support the facility required. | |
502 - Bad Gateway - The server, while acting as a gateway or proxy, received an invalid response from the upstream server it accessed to fulfil the request.
503 - Service Unavailable - The server cannot process the request due to a temporary overload or maintenance; the condition is usually temporary and may be signalled with a Retry-After header.
504 - Gateway Timeout - Gateway did not receive response from upstream server. | |
505 - HTTP Version Not Supported - Server does not support the HTTP protocol version.
506 - Variant Also Negotiates - Content negotiation for the request results in a circular reference. | |
507 - Insufficient Storage - Server is unable to store the representation. | |
508 - Loop Detected - Server detected an infinite loop while processing the request. | |
509 - Bandwidth Limit Exceeded - Bandwidth limit exceeded. | |
510 - Not Extended - Further extensions to the request are required. | |
511 - Network Authentication Required - Client needs to authenticate to gain network access. | |
598 - Network Read Timeout Error - Network read timeout behind the proxy. | |
599 - Network Connect Timeout Error - Network connect timeout behind the proxy. |
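A quick way to see which status code a server actually returns (example.com is a placeholder URL):
$ curl -s -o /dev/null -w "%{http_code}\n" https://example.com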
--- JINJA ---
--> https://jinja.palletsprojects.com/en/3.0.x/templates/#jinja-filters.indent
## condition | |
{% if VAR == VALUE %} or {% if VAR is defined %} | |
PUT THIS | |
{% endif %} | |
## indent multiline var | |
vars: | |
level_1: | |
level_2: | | |
- list_1 | |
- list_2 | |
--- | |
main: | |
- this_is_a_list: | |
{{ level_1.level_2 | indent( width=4, first=False) }} | |
--- | |
--- KUBERNETES: DOC ---
+ node: physical or virtual server where Kubernetes runs (control plane/master, workers)
+ pod: smallest deployable unit, a set of one or more containers
+ service: exposes a set of pods for network communication
  clusterIP
  nodePort
+ volume: storage for pods, shared by the pod's containers
  persistent
  nonPersistent
+ deployment: manages the rollout and scaling of a set of pods
+ namespace: subset of the cluster, isolating a set of resources
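A minimal sketch tying these objects together with kubectl (NAME, IMAGE and NAMESPACE are placeholders):
$ kubectl create namespace NAMESPACE
$ kubectl create deployment NAME --image=IMAGE -n NAMESPACE
$ kubectl expose deployment NAME --port=80 --type=ClusterIP -n NAMESPACE
$ kubectl get pods,services -n NAMESPACE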
--- MISC --- | |
## run shell on container | |
$ kubectl exec --stdin --tty CONTAINER -- /bin/bash | |
## restart a workload/deployment
$ kubectl rollout restart deployment DEPLOYMENT | |
--- SECRET ---- | |
## declaration | |
apiVersion: v1 | |
kind: Secret | |
metadata: | |
name: SECRET-NAME | |
data:
KEY1: VALUE | |
KEY2: VALUE | |
## usage | |
spec: | |
containers: | |
- name: NAME | |
image: IMAGE | |
env: | |
- name: VAR_NAME | |
valueFrom: | |
secretKeyRef: | |
name: SECRET-NAME | |
key: KEY_NAME | |
## type | |
- data: base64
- stringData: plain text | |
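The same Secret can also be created imperatively; a minimal sketch (the key/value literals are placeholders):
$ kubectl create secret generic SECRET-NAME --from-literal=KEY1=VALUE --from-literal=KEY2=VALUE
$ kubectl get secret SECRET-NAME -o yaml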
--- ISSUE --- | |
## pv stuck terminating | |
$ kubectl patch pv PV_NAME -p '{"metadata":{"finalizers":null}}' | |
or | |
$ kubectl edit pv PV_NAME | |
remove the finalizer line "- kubernetes.io/pv-protection" under finalizers:
$ kubectl delete pv PV_NAME --grace-period=0 --force |
--- LICENSES: OPEN SOURCE ---
## Apache | |
## Artistic License | |
## BSD | |
## Common Development and Distribution License (CDDL) | |
## CREATIVE COMMONS (CC) | |
Licenses that give creators flexibility in sharing their work while letting them choose how it may be used, attributed, and shared.
https://creativecommons.org/share-your-work/cclicenses/ | |
+ CC BY (Attribution): | |
allows others to distribute, remix, adapt, and build upon your work, even for commercial purposes, as long as they give you appropriate credit. | |
This is the most permissive CC license. | |
+ CC BY-SA (Attribution-ShareAlike): | |
Similar to CC BY but requires any derivative works to be licensed under the same terms, | |
ensuring that subsequent creations remain open and shareable. | |
+ CC BY-NC (Attribution-NonCommercial): | |
permits others to use and modify your work for non-commercial purposes, as long as they provide proper attribution. | |
Commercial use is not allowed. | |
+ CC BY-ND (Attribution-NoDerivatives): | |
others can reuse your work for any purpose, even commercially, but they can't make any modifications or derivative works. | |
They must credit you as the creator. | |
+ CC BY-NC-SA (Attribution-NonCommercial-ShareAlike): | |
combines the requirements of CC BY-NC and CC BY-SA, | |
allowing for non-commercial use and requiring derivative works to be licensed under the same terms. | |
+ CC BY-NC-ND (Attribution-NonCommercial-NoDerivatives): | |
the most restrictive CC license. | |
allows others to download your work and share it with proper attribution, but they can't change it or use it for commercial purposes. | |
+ CC 0 (No Rights Reserved): | |
not technically a license but a dedication to the public domain. | |
waive all your copyright and related rights, effectively placing the work in the public domain | |
## Eclipse Public License (EPL) | |
## GNU General Public License (GPL): | |
## GNU Lesser General Public License (LGPL) | |
## MIT | |
## Mozilla Public License (MPL) | |
--- CLOSED SOURCE --- | |
--- COMMERCIAL --- |
--- MYSQL: MISC ---
## error! | |
>> 1418 (HY000) at line 10185: This function has none of DETERMINISTIC, NO SQL, or READS SQL DATA in its declaration and binary logging is enabled (you *might* want to use the less safe log_bin_trust_function_creators variable) | |
1- Execute: | |
SET GLOBAL log_bin_trust_function_creators = 1; | |
or | |
2- Add to the my.cnf (or my.ini on Windows) config file:
log_bin_trust_function_creators = 1
## show mysqld startup config | |
> SHOW VARIABLES; | |
> SHOW VARIABLES LIKE '%innodb_log_buffer_size%'; | |
--- MYSQL DUMP --- | |
## error mysqldump! | |
>> Unknown table 'COLUMN_STATISTICS' in information_schema | |
Flag "COLUMN_STATISTICS" is enabled by default in mysqldump 8 | |
Run mysqldump with additional parameter: | |
$ mysqldump --column-statistics=0 | |
Make change permanent: | |
--> ~/.my.cnf | |
[mysqldump] | |
column-statistics=0 | |
--- TRANSACTION MODE --- | |
## start | |
START TRANSACTION; | |
CMD_1; | |
CMD_N; | |
... | |
## commit | |
COMMIT; | |
## rollback | |
ROLLBACK; | |
--- NETWORK: MISC ---
## tcpdump | |
show interface | |
$ tcpdump -D | |
on specific interface | |
$ tcpdump -i INTERFACE | |
on specific port | |
$ tcpdump port PORT -i INTERFACE | |
capture only N number of packets | |
$ tcpdump -c N -i INTERFACE | |
display captured packets in ASCII | |
$ tcpdump -A -i INTERFACE | |
display Captured Packets in HEX and ASCII | |
$ tcpdump -XX -i INTERFACE | |
capture and save packets in a file | |
$ tcpdump -w FILE.pcap -i INTERFACE | |
read captured packets file | |
$ tcpdump -r FILE.pcap | |
show IP addresses instead of resolving hostnames
$ tcpdump -n -i INTERFACE | |
capture only TCP Packets | |
$ tcpdump -i INTERFACE tcp | |
## netcat | |
+ Transfer file
--> on receiver | |
$ nc -l -p PORT > FILE | |
--> on sender | |
$ nc -w 3 DEST_IP PORT < FILE | |
+ Compress & transfer file
--> on receiver | |
$ nc -l -p PORT | uncompress -c | tar xvfp - | |
--> on sender | |
$ tar cfp - DIR | compress -c | nc -w 3 DEST_IP PORT | |
+ Remote (bind) shell with root privileges
--> target | |
$ ncat -l PORT -e /bin/bash -v -w 5000ms | |
--> source | |
$ ncat 127.0.0.1 PORT -v | |
+ Test open port | |
$ nc -vn IP PORT | |
$ nc -vnz -w 2 IP PORT_START-PORT_END
--- WIFI --- | |
## scan wifi | |
$ iwlist INTERFACE scan | |
## connect to wifi via cli | |
$ nmcli dev wifi connect SSID password PASSWORD | |
--- IP ADDRESS --- | |
## address classes
A: 1.0.0.0 - 126.255.255.255 (private: 10.0.0.0 - 10.255.255.255)
B: 128.0.0.0 - 191.255.255.255 (private: 172.16.0.0 - 172.31.255.255)
C: 192.0.0.0 - 223.255.255.255 (private: 192.168.0.0 - 192.168.255.255)
D: 224.0.0.0 - 239.255.255.255 (multicast)
E: 240.0.0.0 - 255.255.255.255 (reserved by IETF)
127.0.0.0/8: loopback
0.0.0.0: default route
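A quick check of which block an address falls in, using Python's standard ipaddress module (the address is an arbitrary example):
$ python3 -c 'import ipaddress; ip = ipaddress.ip_address("192.168.50.10"); print(ip.is_private, ip.is_loopback, ip.is_multicast)'
True False False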
--- IPV6 ---
## disable IPv6 | |
$ echo "net.ipv6.conf.all.disable_ipv6 = 1" | sudo tee /etc/sysctl.d/70-disable-ipv6.conf
$ sudo sysctl -p -f /etc/sysctl.d/70-disable-ipv6.conf |
--- NGINX: MISC ---
## misc | |
Check syntax | |
$ nginx -t | |
Dump all config | |
$ nginx -T | |
Reload conf | |
$ nginx -s reload | |
--- BUFFER --- | |
## tuning buffer | |
ERROR: upstream sent too big header while reading response header from upstream | |
--> https://www.getpagespeed.com/server-setup/nginx/tuning-proxy_buffer_size-in-nginx | |
proxy_busy_buffers_size 24k; | |
proxy_buffers 64 4k; | |
proxy_buffer_size 16k; |
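To confirm which buffer values the running configuration actually contains after a reload (the grep pattern is just an example):
$ nginx -T 2>/dev/null | grep -E "proxy_.*buffer"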
## error: ngrok - install failed [Error: EACCES: permission denied | |
--> https://github.com/bubenshchykov/ngrok#usage | |
$ npm install --unsafe-perm -g ngrok |
--- PACKER ---
## Do not remove instance when packer build fails | |
$ packer build -on-error=abort APP.json |
--- JQ --- | |
## get all paths | |
$ jq -r '[paths | join(".")]' FILE.json | |
## get all paths and their values
$ jq --stream -r 'select(.[1]|scalars!=null) | "\(.[0]|join(".")): \(.[1]|tojson)"' FILE.json | |
$ jq -r 'paths(scalars) as $p | [ ( [ $p[] | tostring ] | join(".") ), ( getpath($p) | tojson )] | join(": ")' FILE.json |
TRAINING
RTE: | |
Release Train Engineer | |
STE: | |
Solution Train Engineer |
--- LINUX: MISC ---
## copy/move all files except some (bash extglob, enable with: shopt -s extglob)
$ cp|mv !(FILE_1|FILE_N) DEST/
## list all functions defined in shell | |
See list | |
$ declare -F | |
$ compgen -A function | |
See list of function + aliases | |
$ compgen -a -A function | |
See list with content | |
$ declare -f | |
See content of a function
$ type FUNCTION | |
## redirect cmd output/errors to file | |
$ CMD --params 2>&1 | tee /path/to/file.log | |
## chaining command with list output | |
$ CMD_1_OUTPUT_LIST | xargs -I {} CMD_2_PROCESS_EVERY_ITEM_OF_THE_LIST | |
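A concrete example of this pattern, compressing every .log file found (the pattern and gzip are arbitrary choices):
$ find . -name "*.log" | xargs -I {} gzip {}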
## update-alternatives | |
Install | |
$ update-alternatives --install /path/to/link name /path/to/binary [priority] | |
$ update-alternatives --install /usr/bin/python python /usr/bin/python2.7 2 | |
$ update-alternatives --install /usr/bin/python python /usr/bin/python3.7 3 | |
Select | |
$ update-alternatives --config python | |
## XML lint, formatting | |
in package: libxml2-utils | |
$ xmllint --format UGLY.xml --output FORMATTED.xml | |
## sudo without password | |
$ visudo | |
%sudo ALL=(ALL) NOPASSWD:ALL | |
## autocompletion | |
--> in bashrc | |
if [ -f /etc/bash_completion ]; then | |
. /etc/bash_completion | |
fi | |
## add timestamp to bash_history | |
$ export HISTTIMEFORMAT="%F %T " | |
# cat > /etc/profile.d/timestamp-history.sh | |
HISTTIMEFORMAT="%F %T " | |
--- ARCHIVE --- | |
## zip multiple files in one archive | |
$ zip ARCHIVE.zip FILE1 FILE2 FILEN $FILE* | |
$ zip -r ARCHIVE.zip DIR/ | |
## zip/unzip with password
$ zip -e ARCHIVE.zip FILE            (prompts for the password)
$ zip -P "PASSWORD" ARCHIVE.zip FILE (password on the command line)
$ unzip -P "PASSWORD" ARCHIVE.zip
## list file in zip | |
$ unzip -l oc-4.13.0.zip "*/" | |
$ zipinfo -1 oc-4.13.0.zip "*/" | |
--- SERVICE --- | |
## /etc/init.d | |
$ sudo vim /etc/init.d/SERVICE | |
--> https://gist.githubusercontent.com/miromannino/a17f3e6f3fdcb4d94a1f/raw/e9bc2a8179860701224be896b420a635454dd3bd/service.sh | |
$ sudo chmod a+x /etc/init.d/SERVICE | |
$ sudo update-rc.d SERVICE defaults | |
--- LOGIN MESSAGE ---
## MOTD: login banner message for system-wide announcements | |
/etc/motd | |
## HUSHLOGIN: login banner message for individual user (suppress motd) | |
~/.hushlogin | |
--- HARDWARE --- | |
## check hardware device information
List all | |
$ lspci
ID  Type: Manufacturer, Version
0000:2a:00.0 Network controller: Realtek Semiconductor Co., Ltd. Device b852 | |
Show details info about a device | |
$ lspci -v -s ID | |
--- FILE --- | |
## delete lines from line N to the end of file
$ sed -i 'N,$d' FILE | |
--- DESKTOP MANAGER --- | |
## change wallpaper | |
KDE | |
$ qdbus org.kde.plasmashell /PlasmaShell org.kde.PlasmaShell.evaluateScript 'var allDesktops = desktops();print (allDesktops);for (i=0;i<allDesktops.length;i++) {d = allDesktops[i];d.wallpaperPlugin = "org.kde.image";d.currentConfigGroup = Array("Wallpaper", "org.kde.image", "General");d.writeConfig("Image", "file:///PATH/TO/FILE.png")}' | |
--- SERVICES --- | |
## check service journalctl | |
$ journalctl -u SERVICE.service | |
--- PACKAGE MANAGER --- | |
## check which package provides a CMD/file
$ dpkg -S $(which CMD) | |
$ yum whatprovides $(which CMD) | |
$ dnf provides $(which CMD) | |
$ rpm -qf $(which CMD)
## snap | |
$ snap install PACKAGE | |
$ snap info PACKAGE | |
$ snap refresh PACKAGE --channel=X.Y/stable --classic | |
## hold a specific version of package | |
$ apt-mark hold PACKAGE-x.y.z | |
or | |
$ vim /etc/apt/preferences.d/PACKAGE | |
Package: linux-image-4.19.0-6-amd64 | |
Pin: version 4.19.81-2+deb10u1 | |
Pin-Priority: -1 | |
--- SSH --- | |
## ssh config (.ssh/config) | |
+ keep connection alive | |
TCPKeepAlive yes | |
ServerAliveInterval 30 | |
+ use private key | |
IdentityFile ~/.ssh/key.openssh | |
+ log level | |
LogLevel QUIET|FATAL|ERROR|INFO|VERBOSE|DEBUG|DEBUG1|DEBUG2|DEBUG3 (default: INFO)
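A minimal ~/.ssh/config sketch combining these options (myserver, 192.0.2.10 and deploy are hypothetical values):
Host myserver
  HostName 192.0.2.10
  User deploy
  IdentityFile ~/.ssh/key.openssh
  TCPKeepAlive yes
  ServerAliveInterval 30
  LogLevel ERROR
then connect with: $ ssh myserver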
## verify ssh keypair match | |
$ ssh-keygen -y -f <private_key>
compare the output with <public_key> (key type and base64 part must match)
## proxy tcp tunnel over ssh | |
SSH only | |
$ /usr/bin/ssh -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" -NL [LOCAL_PORT]:127.0.0.1:[REMOTE_PORT] [REMOTE_USER]@[REMOTE_ADDRESS] -p [REMOTE_SSH_PORT]
AUTOSSH
$ /usr/bin/autossh -M 0 -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" -NL [LOCAL_PORT]:127.0.0.1:[REMOTE_PORT] [REMOTE_USER]@[REMOTE_ADDRESS] -p [REMOTE_SSH_PORT]
ssh tunnel for mysql | |
$ ssh -N -L LOCAL_PORT:REMOTE_INTERFACE_IP:REMOTE_PORT USER@REMOTE_IP | |
## disable password auth for specific users/group | |
--> /etc/ssh/sshd_config | |
" | |
#PasswordAuthentication yes | |
Match [User|Group|Address] [USER]
PasswordAuthentication no | |
" | |
[USER]: - user1,user2 | |
- group1,group2 | |
- !root | |
- 192.168.1.2,192.168.12.* | |
- hostname | |
- | |
## Without host key verification | |
$ ssh -o "UserKnownHostsFile=/dev/null" -o "StrictHostKeyChecking=no" user@host | |
## save key passphrase (avoid prompt) | |
$ eval `ssh-agent -s` | |
$ ssh-add ~/.ssh/PRIVATE_KEY | |
--- PDF --- | |
## combine pdf | |
$ pdfunite file1.pdf file2.pdf fileN.pdf output.pdf | |
--- USER/GROUP --- | |
## add system user | |
$ sudo adduser --system --ingroup GROUP USER
~$ sudo useradd -g GROUP -d /dev/null -s /usr/sbin/nologin USER | |
## delete user from group | |
$ gpasswd -d USER GROUP | |
## run as other user | |
$ runuser -l USER -c 'cmd args' | |
~$ runuser -u USER -- cmd args | |
~$ su - USER -c "cmd args" | |
~$ sudo -u USER cmd args | |
--- WGET --- | |
## pass user/pwd | |
$ wget --http-user USER --http-password PWD URL | |
--- LINK --- | |
## change symbolic link target
$ readlink -v LINK | |
$ ln -sfn /PATH/TO/NEW/FILE LINK_NAME | |
--- XARGS --- | |
## loop with xargs | |
$ CMD_TO_GET_A_LIST | xargs -I {} CMD_TO_RUN_ON_EACH_ITEM {}
--- OPENSSL --- | |
[GET INFO] | |
## check CSR (Certificate Signing Request) | |
$ openssl req -text -noout -verify -in CSR.csr | |
## check Private Key | |
$ openssl rsa -in privateKey.key -check | |
## check PEM formatted certificate | |
$ openssl x509 -text -noout -in cert.{pem|crt} | |
## check DER formatted certificate | |
$ openssl x509 -in MYCERT.der -inform der -text | |
## check PKCS#12 file (.pfx or .p12) | |
$ openssl pkcs12 -info -in keyStore.p12 | |
## check certificate from URL | |
$ openssl s_client -showcerts -connect [URL]:443 | |
## check expiration date | |
$ echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null | openssl x509 -noout -dates | |
## Check all SAN (Subject Alternative Name) in the certificate | |
$ openssl s_client -connect website.com:443 | openssl x509 -noout -text | grep DNS: | |
## check if cert & private key match | |
$ openssl x509 -modulus -noout -in CERT.crt |md5sum | |
$ openssl rsa -modulus -noout -in PRIV.priv |md5sum | |
[GENERATE CERTIFICATE] | |
## generate Private Key and Certificate Signing Request | |
$ openssl req -out CSR.csr -new -newkey rsa:2048 -nodes -keyout privateKey.key | |
## generate self-signed certificate | |
$ openssl req -x509 -sha256 -nodes -days 365 -newkey rsa:2048 -keyout privateKey.key -out certificate.crt | |
## generate CSR for existing private key | |
$ openssl req -out CSR.csr -key privateKey.key -new | |
## generate CSR based on an existing certificate | |
$ openssl x509 -x509toreq -in certificate.crt -out CSR.csr -signkey privateKey.key | |
## Remove passphrase from Private Key | |
$ openssl rsa -in privateKey.pem -out newPrivateKey.pem | |
## Generate private key and self-signed certificate for localhost | |
$ openssl req -x509 -out localhost.crt -keyout localhost.key \ | |
-newkey rsa:2048 -nodes -sha256 \ | |
-subj '/CN=localhost' -extensions EXT -config <( \ | |
printf "[dn]\nCN=localhost\n[req]\ndistinguished_name = dn\n[EXT]\nsubjectAltName=DNS:localhost\nkeyUsage=digitalSignature\nextendedKeyUsage=serverAuth") | |
[CONVERTING] | |
## Convert DER file (.crt .cer .der) to PEM | |
$ openssl x509 -inform der -in certificate.cer -out certificate.pem | |
## Convert PEM file to DER | |
$ openssl x509 -outform der -in certificate.pem -out certificate.der | |
## Convert PKCS#12 file (.pfx .p12) containing a private key and certificates to PEM | |
$ openssl pkcs12 -in keyStore.pfx -out keyStore.pem -nodes | |
You can add -nocerts to only output the private key or add -nokeys to only output the certificates. | |
## Convert PEM certificate file and a private key to PKCS#12 (.pfx .p12) | |
$ openssl pkcs12 -export -out certificate.pfx -inkey privateKey.key -in certificate.crt -certfile CACert.crt | |
--- MEDIA --- | |
## convert image with IMAGEMAGICK | |
--> install imagemagick | |
Change image format | |
$ convert FILE_SRC.jpg FILE_DEST.png | |
Reduce image quality | |
$ convert FILE_SRC.jpg -quality X% FILE_DEST.jpg | |
Resize image | |
$ convert FILE_SRC.jpg -resize 200x300 FILE_DEST.jpg |
--- BASH: MISC ---
## shebang (sh-bang)
#!/bin/bash | |
#!/usr/bin/env bash|python | |
## shell options | |
set -o pipefail | |
set -o errexit | |
set -o nounset | |
set -o xtrace | |
## include debug option in run | |
DEBUG=${DEBUG:=0} | |
[[ $DEBUG -eq 1 ]] && set -o xtrace | |
## redirect script output to stdout & in log file | |
exec &> >(tee -a "/PATH/TO/FILE.LOG") | |
## prevent script from being executed directly (library case)
if [ "${0}" == "${BASH_SOURCE[0]}" ]; then | |
echo "This script is not meant to be executed directly!" | |
echo "Use this script only by sourcing it." | |
exit 1 | |
fi | |
## trap event | |
trap 'cmd' EVENT1 EVENT2 EVENT3 | |
e.g: trap 'exit' INT TERM ERR | |
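A minimal script skeleton combining these options with a cleanup trap (TMP_FILE is a hypothetical variable):
#!/usr/bin/env bash
set -o errexit -o nounset -o pipefail
DEBUG=${DEBUG:-0}
if [[ ${DEBUG} -eq 1 ]]; then set -o xtrace; fi
TMP_FILE=$(mktemp)
trap 'rm -f "${TMP_FILE}"' EXIT INT TERM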
--- VARIABLES --- | |
## get current directory | |
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )" | |
## array, list
Indexed array:
ARRAY=('VALUE_1' 'VALUE_2' 'VALUE_N')
Associative array (string keys, requires declare -A):
declare -A ARRAY
ARRAY=([NAME_1]='VALUE_1' [NAME_2]='VALUE_2' [NAME_N]='VALUE_N')
ARRAY[NAME_1]="VALUE_1"
ARRAY[NAME_2]="VALUE_2"
ARRAY[NAME_N]="VALUE_N"
Get all values:
${ARRAY[@]}
Get all keys (index):
${!ARRAY[@]}
## parameter Substitution | |
--> http://www.tldp.org/LDP/abs/html/parameter-substitution.html#PARAMSUBREF | |
--> https://www.cyberciti.biz/tips/bash-shell-parameter-substitution-2.html | |
- if not set, use default | |
VAR=${VAR_EXPECTED:-default_value} | |
VAR=${VAR_EXPECTED:-`cmd`} | |
-> VAR=default_value | |
Double assignment (also sets the source variable):
VAR=${VAR_EXPECTED:=default_value}
-> both VAR and VAR_EXPECTED now contain default_value
- if not set, print error message and exit 1 | |
VAR=${VAR_EXPECTED:?ERROR VAR_EXPECTED is not defined} | |
- replace content | |
a="b.c" | |
d="${a/./-}" | |
echo ${d} -> "b-c" | |
- to uppercase
VAR_UPPER=${VAR^^}
- to lowercase
VAR_LOWER=${VAR,,}
- get the last path segment of a URL
${VAR##*/}
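A small sketch exercising these substitutions (the URL and variable names are arbitrary examples):
URL="https://example.com/downloads/archive.tar.gz"
FILE=${URL##*/}                 # archive.tar.gz
echo "${FILE/./-}"              # archive-tar.gz (first "." replaced)
echo "${FILE^^}"                # ARCHIVE.TAR.GZ
echo "${UNSET_VAR:-fallback}"   # fallback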
--- ARRAY --- | |
## declare array var | |
declare -A my_var | |
my_var["content1"]="value1" | |
my_var["content2"]="value2; | |
## append item to array | |
declare -a ARRAY | |
ARRAY=("a" "b") | |
ARRAY+=("c" "d) | |
## array length | |
${#ARRAY_NAME[@]} |
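To iterate over keys and values (works for indexed and associative arrays alike):
for key in "${!ARRAY[@]}"; do
  echo "${key} -> ${ARRAY[$key]}"
done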
## disable system beep | |
$ rmmod pcspkr; echo "blacklist pcspkr" >>/etc/modprobe.d/blacklist.conf |
## add trusted CA certificates | |
install package "ca-certificates" | |
$ sudo cp CERT.cert /usr/local/share/ca-certificates/CERT.crt
$ sudo update-ca-certificates
## get certificate from URL | |
$ openssl s_client -showcerts -connect URL:443 </dev/null 2>/dev/null|openssl x509 -outform PEM > CERT.pem | |
--- ABBREVIATIONS ---
A
amazon web service: aws |cloud, aws | |
azure: az (local) | |
B | |
C | |
cluster: cls | |
D | |
development: dev |environment | |
E | |
F | |
G | |
google cloud platform: gcp |cloud, gcp | |
H | |
I | |
ingress: ing |cloud, gcp | |
integration: int |environment | |
internal ip: iip |gcp, network | |
J | |
K | |
kubernetes: k8s (official), kub (local) |cloud, container, kubernetes | |
L | |
laboratory: lab |environment | |
load balancer: lb |cloud, network | |
M | |
N | |
O | |
openShift: oc (local) |cloud, kubernetes | |
P | |
production: prd |environment | |
pub/sub: psb |cloud, gcp | |
public ip: pip |gcp, network | |
Q | |
queue: qu | | |
R | |
route: rut |network | |
S | |
T | |
U | |
V | |
virtual machine: vm |cloud | |
virtual ip: vip |network | |
W | |
X | |
Y | |
Z |
--- VIM ---
## comment/un-comment multiple lines
+ | |
comment
- visual block on needed lines: CTRL+V
- insert mode on first line: SHIFT+I
- add #
- ESC
un-comment
- visual block on the # column: CTRL+V
- delete the selection: x
+ | |
comment: | |
- shift-v | |
- :norm i# | |
uncomment: | |
- shift-v | |
- :norm x | |
## change colorscheme | |
:colorscheme TAB | |
## save with sudo if the file was opened without it
:w !sudo tee % | |
## uppercase/lowercase | |
~ : case of current character | |
guu : current line from upper to lower. | |
gUU : current LINE from lower to upper. | |
guw : end of current WORD from upper to lower. | |
guaw : all of current WORD to lower. | |
gUw : end of current WORD from lower to upper. | |
gUaw : current WORD to upper. | |
g~~ : Invert case to entire line | |
g~w : Invert case to current WORD | |
guG : lowercase until the end of document. | |
gU) : until end of sentence to upper case | |
gu} : end of paragraph to lower case | |
gU5j : 5 lines below to upper case | |
gu3k : 3 lines above to lower case | |
--- VIMRC --- | |
## indent
filetype plugin indent on | |
" show existing tab with 4 spaces width | |
set tabstop=4 | |
" when indenting with '>', use 4 spaces width | |
set shiftwidth=4
" On pressing tab, insert 4 spaces | |
set expandtab
--- POWERSHELL ---
## Create user profile file
Test if file exists | |
$ Test-Path $profile | |
False | |
Create user profile file | |
$ New-Item -Path $PROFILE -ItemType File -Force | |
C:\Users\$USER\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1 | |
## Create Alias | |
$ Get-Alias | |
$ New-Alias ALIAS_NAME CMD_NAME | |
or | |
$ Set-Alias -Name ALIAS_NAME -Value 'CMD PARAM=VALUE' | |
## Create function | |
Function _my_function { | |
CMD PARAM=VALUE | |
} |
## add missing fonts | |
Download zip -> https://github.com/BannedPatriot/ttf-wps-fonts | |
$ sudo mkdir -p /usr/share/fonts/truetype/msttcorefonts | |
$ cd ttf-wps-fonts && sudo ./install.sh | |
$ sudo fc-cache -fv |