iklobato

@iklobato
iklobato / bind_keys.md
Last active January 16, 2024 15:33
Enable Word Navigation in iTerm 2 Using Option (Alt) and Arrow Keys

Optimize iTerm 2 Word Navigation Configuration

Enhance your iTerm 2 command-line experience with improved word navigation. This guide covers two approaches: a convenient one-liner script and a manual method.

One-Liner Approach

Learn how to quickly configure word navigation in iTerm 2 using a one-liner script. This script enhances key bindings with the Option (Alt) key and arrow keys, facilitating smooth movement one word backward and forward.

Implementation
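A one-liner sketch of the idea, assuming iTerm2 keeps its global key map in the `com.googlecode.iterm2` defaults domain; the plist key codes and the action ID below are assumptions to verify against your iTerm2 version, not confirmed values:

```shell
# Hypothetical sketch: bind Option-Left/Right to the readline word-motion
# escape sequences (Esc+b / Esc+f). The plist keys "0xf702-0x280000" /
# "0xf703-0x280000" (arrow key codes plus the Option modifier) and
# Action = 10 (send escape sequence) are assumptions, not verified values.
defaults write com.googlecode.iterm2 GlobalKeyMap -dict-add \
  "0xf702-0x280000" '{ Action = 10; Text = "b"; }'   # Option-Left  -> Esc+b
defaults write com.googlecode.iterm2 GlobalKeyMap -dict-add \
  "0xf703-0x280000" '{ Action = 10; Text = "f"; }'   # Option-Right -> Esc+f
```

Restart iTerm 2 after writing the defaults. The manual alternative is Preferences → Profiles → Keys: add mappings for ⌥← and ⌥→ with the action "Send Escape Sequence" and text `b` and `f` respectively.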

@iklobato
iklobato / study_plan.md
Last active January 15, 2024 21:46
Comprehensive 30-Day Algorithm Study Plan for Coding Interviews

INTERVIEWS

Comprehensive 30-Day Algorithm Study Plan for Coding Interviews

Based on: https://www.hackerrank.com/interview/interview-preparation-kit

This 30-day study plan is designed to build a solid understanding of fundamental algorithmic concepts, tailored for coding interviews. Its goal is to equip you with the skills and knowledge to confidently tackle the algorithmic challenges commonly encountered in technical interviews.

The plan opens with a warm-up phase so you can familiarize yourself with the HackerRank environment and ease into problem-solving. Each subsequent day covers a key algorithmic topic, with a breakdown of subtopics for more thorough exploration.

@iklobato
iklobato / pyenv_install.sh
Last active January 12, 2024 01:46
pyenv linux install
#!/bin/bash
#
# Pyenv Installer Script
#
# This script automates the installation of Pyenv and the latest Python versions
# on a Debian-based system or macOS. It installs necessary dependencies, sets up Pyenv,
# restarts the terminal, and installs the latest Python versions concurrently.
#
# Note: Ensure you have sudo privileges to install system dependencies.
#
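The header's "installs the latest Python versions concurrently" step can be sketched with background jobs and `wait`. The version numbers are examples, and `echo` stands in for the actual `pyenv install -s "$v"` call:

```shell
# Concurrency pattern for installing several Python versions in parallel.
# Each loop body runs in a background subshell; `wait` blocks until all
# background jobs have finished.
for v in 3.11.7 3.12.1; do
  (echo "installing $v") &   # the real script would run: pyenv install -s "$v"
done
wait
echo "all installs finished"
```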
@iklobato
iklobato / docker_install.sh
Last active January 5, 2024 02:54
Docker installation on Ubuntu, single script
##############################################################################
# Docker Installation Script for Ubuntu #
# #
# Automates the installation of Docker on Ubuntu systems. Checks if Docker #
# is already installed; if not, it proceeds with the installation steps. #
# #
# This script simplifies Docker installation on Ubuntu by updating package #
# lists, installing necessary dependencies like ca-certificates, curl, and #
# gnupg, setting up the Docker repository, and installing Docker CE, #
# Docker CLI, containerd.io, docker-buildx-plugin, and docker-compose-plugin #
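The "checks if Docker is already installed" guard the header describes can be sketched with `command -v`; the `echo` in the else branch is a stand-in for the script's actual apt installation steps:

```shell
# Skip installation when the docker binary is already on PATH.
if command -v docker >/dev/null 2>&1; then
  echo "Docker already installed, skipping"
else
  echo "Docker not found, running installation steps"
  # The real script would run here: apt-get update, install ca-certificates,
  # curl and gnupg, add Docker's apt repository, then install docker-ce,
  # docker-ce-cli, containerd.io, docker-buildx-plugin, docker-compose-plugin.
fi
```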
@iklobato
iklobato / best_machine.py
Last active October 24, 2023 23:43
Scrape EC2 pricing to find the instance with the most processing power at the lowest cost
from datetime import datetime
import pandas as pd
import requests
pd.set_option('display.max_columns', 8)
pd.set_option('max_seq_item', None)
pd.set_option('display.width', 200)
zones = [
'US East (N. Virginia)',
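The ranking step the description implies can be sketched with pandas. The sample data and column names below are made up for illustration, not the gist's actual schema:

```python
import pandas as pd

# Hypothetical sample of EC2 instance pricing data.
df = pd.DataFrame({
    'instance_type': ['t3.micro', 'c5.large', 'm5.xlarge'],
    'vcpus': [2, 2, 4],
    'price_per_hour': [0.0104, 0.085, 0.192],
})

# Rank by processing power per dollar: more vCPUs per hourly dollar is better.
df['vcpus_per_dollar'] = df['vcpus'] / df['price_per_hour']
best = df.sort_values('vcpus_per_dollar', ascending=False).iloc[0]
print(best['instance_type'])  # t3.micro wins on this sample data
```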
@iklobato
iklobato / bucket-migration.sh
Last active November 22, 2023 15:57
bucket-migration
#!/bin/bash
# Purpose: This Bash script automates data migration between BigQuery and Google Cloud Storage.
# It uses Google Cloud Storage as temporary staging: it exports table schemas to JSON, loads data from CSV
# files into BigQuery for analysis, and moves data between storage buckets. It also simplifies bucket
# management for organized data and redundancy.
# Value: For GCP users, this script simplifies and speeds up data migration between BigQuery projects,
# making it a useful resource for data management.
# How to Run: Before executing, set key variables like `SOURCE_PROJECT_ID`, `DESTINATION_PROJECT_ID`,
# and `DATASET_NAME`. Then, prepare an input file listing the table names you want to process. Run the script in
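The three operations the header describes map onto the `bq` CLI roughly as follows. This is a sketch, not the gist's actual body: the table name `my_table` and the bucket `temp-bucket` are placeholders.

```shell
# 1. Export a table's schema to JSON:
bq show --schema --format=prettyjson \
  "${SOURCE_PROJECT_ID}:${DATASET_NAME}.my_table" > my_table_schema.json

# 2. Extract the table to a bucket used as temporary storage:
bq extract --destination_format=CSV \
  "${SOURCE_PROJECT_ID}:${DATASET_NAME}.my_table" gs://temp-bucket/my_table.csv

# 3. Load the CSV into the destination project's dataset, reusing the schema:
bq load --source_format=CSV --skip_leading_rows=1 \
  "${DESTINATION_PROJECT_ID}:${DATASET_NAME}.my_table" \
  gs://temp-bucket/my_table.csv ./my_table_schema.json
```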
@iklobato
iklobato / migrate_sh.sh
Last active August 30, 2024 12:54
This script moves data from BigQuery between different Google Cloud accounts
#!/bin/bash
# Enable shell tracing when -v appears anywhere in the arguments.
if [[ "$*" == *-v* ]]; then
  set -x
fi

# Migration parameters, taken from the environment.
source_project_id="${SOURCE_PROJECT_ID}"
destination_project_id="${DESTINATION_PROJECT_ID}"
dataset_name="${DATASET_NAME}"
@iklobato
iklobato / buckets.md
Last active July 27, 2023 19:56
Moving data between buckets

From bucket to bucket

The command below uses gsutil, the command-line tool for interacting with Google Cloud Storage, to perform a high-performance, parallelized copy (cp) from one location (gs://source) to another (gs://destination).

gsutil -o "GSUtil:parallel_composite_upload_threshold=150M" \
       -o "GSUtil:parallel_thread_count=10" \
       -o "GSUtil:check_hashes=if_fast_else_skip" \
       -o "GSUtil:sliced_object_download=true" \
       -h "Content-Encoding:gzip" \
       cp -r gs://source gs://destination
@iklobato
iklobato / django_min.py
Created February 11, 2023 02:45
Creating a model endpoint using the Django framework
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'minimaldjango.settings')
from django.forms import DateField
from rest_framework import routers
from rest_framework.serializers import ModelSerializer
from rest_framework.viewsets import ModelViewSet
from django.db.models import (

import os
from random import choice
import argparse
from tqdm import tqdm
import base64
import re
import requests
FILE_NAME = 'proxies_downloaded.txt'