Hoai-Thu Vuong (thuvh)
@thuvh
thuvh / ms17-010-bsod.py
Created May 18, 2017 03:55
EternalBlue PoC
from impacket import smb
from struct import pack
import sys
'''
PoC that triggers the EternalBlue bug (BSOD).
Reference:
- http://blogs.360.cn/360safe/2017/04/17/nsa-eternalblue-smb/
'''
@thuvh
thuvh / postgres_queries_and_commands.sql
Created July 25, 2017 06:54 — forked from rgreenjr/postgres_queries_and_commands.sql
Useful PostgreSQL Queries and Commands
-- show running queries (pre 9.2)
SELECT procpid, age(query_start, clock_timestamp()), usename, current_query
FROM pg_stat_activity
WHERE current_query != '<IDLE>' AND current_query NOT ILIKE '%pg_stat_activity%'
ORDER BY query_start desc;
-- show running queries (9.2)
SELECT pid, age(query_start, clock_timestamp()), usename, query
FROM pg_stat_activity
WHERE query != '<IDLE>' AND query NOT ILIKE '%pg_stat_activity%'
ORDER BY query_start desc;
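A natural companion to these listings (not shown in this excerpt, so an addition rather than part of the fork) is cancelling a runaway query once its pid is known, via PostgreSQL's built-in pg_cancel_backend:

-- cancel the running query whose pid the listings above reported
SELECT pg_cancel_backend(12345);  -- 12345 is a placeholder pid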
@thuvh
thuvh / nfs-tunnel.md
Created October 26, 2017 11:44 — forked from proudlygeek/nfs-tunnel.md
Mount NFS Folder via SSH Tunnel

1. Install NFS on Server

Install the required packages (Ubuntu 12.04):

apt-get install nfs-kernel-server portmap

2. Share NFS Folder

Open the exports file:

vim /etc/exports
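For illustration, an /etc/exports entry that accepts connections arriving through a local SSH tunnel might look like the following (the export path and option set are assumptions, not taken from the gist):

/home/user/share 127.0.0.1(rw,sync,no_subtree_check,insecure)

The insecure option is the tunnel-specific part: forwarded connections originate from a non-privileged local port, which NFS would otherwise reject.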
@thuvh
thuvh / rabbitmq.conf
Created July 16, 2018 09:14
RabbitMQ configuration demo
%% -*- mode: erlang -*-
%% ----------------------------------------------------------------------------
%% RabbitMQ Sample Configuration File.
%%
%% Related doc guide: http://www.rabbitmq.com/configure.html. See
%% http://rabbitmq.com/documentation.html for documentation ToC.
%% ----------------------------------------------------------------------------
[
{rabbit,
[%%
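To illustrate the classic Erlang-term format the preview breaks off in, a minimal file pinning the AMQP listener port would look like the following (this particular setting is an assumption, not shown in the excerpt):

[
 {rabbit,
  [%% listen for AMQP client connections on the default port
   {tcp_listeners, [5672]}
  ]}
].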

Best Practices for Azure Redis

Below is a set of best practices that I recommend for most customers, based on my experience helping hundreds of Azure Redis customers investigate various issues.

Configuration and Concepts

  1. Use the Standard or Premium tier for production systems. The Basic tier is a single-node system with no data replication and no SLA. Also, use at least a C1 cache; C0 caches are meant only for simple dev/test scenarios, since they have a shared CPU core and very little memory and are prone to "noisy neighbor" problems.
  2. Remember that Redis is an in-memory data store. Read this article so that you are aware of scenarios where data loss can occur.
  3. Configure your client library to use a "connect timeout" of at least 10 to 15 seconds, giving the system time to connect even under higher CPU conditions (a minimal client-side sketch follows this list). If your client or server tends to be under high load
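As referenced in item 3, here is a minimal sketch of a generous connect timeout using redis-py (the excerpt does not name a client library, and the host name and key are placeholders, so this is an illustration rather than the author's code):

import redis

# socket_connect_timeout bounds the initial TCP connect; socket_timeout
# bounds individual commands once connected (both in seconds).
client = redis.Redis(
    host="mycache.redis.cache.windows.net",  # placeholder cache host
    port=6380,
    ssl=True,
    password="<access-key>",                 # placeholder credential
    socket_connect_timeout=15,
    socket_timeout=10,
)
client.ping()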
@thuvh
thuvh / sample.cpp
Created September 20, 2018 16:23
insertNodeAtTail
/*
* For your reference:
*
* SinglyLinkedListNode {
* int data;
* SinglyLinkedListNode* next;
* };
*
*/
SinglyLinkedListNode* insertNodeAtTail(SinglyLinkedListNode* head, int data) {
    SinglyLinkedListNode* node = new SinglyLinkedListNode;
    node->data = data;
    node->next = nullptr;
    if (head == nullptr) return node;  // empty list: new node is the head
    SinglyLinkedListNode* tail = head;
    while (tail->next != nullptr) tail = tail->next;  // walk to the current tail
    tail->next = node;
    return head;
}
@thuvh
thuvh / test1.cpp
Created October 24, 2018 17:12
cpp pointer
#include <iostream>
using namespace std;

int main() {
    int a = 3;
    int *p, *q;
    p = &a;   // p points to a
    q = p;    // q now holds the same address as p
    cout << *p << " " << *q << endl;  // both dereference to 3
    return 0;
}
@thuvh
thuvh / quandum.py
Created October 29, 2018 02:10
python-vnoi-quandum
from sys import stdin, stdout
from math import floor

def main():
    global n, m, a
    # read the two problem sizes, then m rows of integers
    n, m = [int(x) for x in stdin.readline().split()]
    a = []
    for i in range(m):
        a.append([int(x) for x in stdin.readline().split()])
    g = []
    G = []
@thuvh
thuvh / kubernetes_commands.md
Created May 23, 2019 14:04 — forked from edsiper/kubernetes_commands.md
Kubernetes Useful Commands
@thuvh
thuvh / hive_druid_integration.md
Last active July 4, 2019 10:45 — forked from rajkrrsingh/hive_druid_integration.md
Hive Druid integration: a quick test creating a Druid table from a Hive table
Generate data for the Hive table
echo "generating sample data for hive table"
echo {-1..-181451}hours | xargs -n1 date +"%Y-%m-%d %H:%M:%S" -d >> /tmp/dates.data
echo {-1..-18145}minutes | xargs -n1 date +"%Y-%m-%d %H:%M:%S" -d >> /tmp/dates.data
echo {-1..-1825}days | xargs -n1 date +"%Y-%m-%d %H:%M:%S" -d >> /tmp/dates.data
cat /tmp/dates.data | while read LINE ; do echo $LINE,"user"$((1 + RANDOM % 10000)),$((1 + RANDOM % 1000)) >> /tmp/hive_user_table.data; done

Create the Hive table
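A matching Hive DDL and load statement could look like the following (a hedged sketch; the table and column names are assumptions inferred from the generated timestamp,user,count rows, not copied from the gist):

CREATE TABLE hive_user_table (
  event_time STRING,
  username   STRING,
  cnt        INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH '/tmp/hive_user_table.data' INTO TABLE hive_user_table;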