Rizky Luthfianto (luthfianto)
# 1. Delete all existing rules
iptables -F
# 2. Set default chain policies
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -P OUTPUT DROP
# 4. Allow ALL incoming SSH
iptables -A INPUT -i eth0 -p tcp --dport 22 -m state --state NEW,ESTABLISHED -j ACCEPT
# Because the OUTPUT policy above is DROP, the matching reply traffic must also be allowed
iptables -A OUTPUT -o eth0 -p tcp --sport 22 -m state --state ESTABLISHED -j ACCEPT
package quickcheck
import common._
import org.scalacheck._
import Arbitrary._
import Gen._
import Prop._
import Math._
luthfianto / theano_mlp_small.py
Created April 19, 2016 10:23 — forked from honnibal/theano_mlp_small.py
Stripped-down example of a multi-layer perceptron (MLP) in Theano
"""A stripped-down MLP example, using Theano.
Based on the tutorial here: http://deeplearning.net/tutorial/mlp.html
This example trims away some complexities, and makes it easier to see how Theano works.
Design changes:
* Model compiled in a distinct function, so that symbolic variables are not in run-time scope.
* No classes. Network shown by chained function calls.
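A minimal sketch of that pattern (not the gist's code; sizes, names, and the plain-SGD update are illustrative): the whole symbolic graph is built and compiled inside one function, so no symbolic variables leak into run-time scope, and each layer is just a function call returning its output expression and parameters.
import numpy as np
import theano
import theano.tensor as T

def layer(x, n_in, n_out, activation):
    # one dense layer: returns (output expression, trainable parameters)
    W = theano.shared(0.01 * np.random.randn(n_in, n_out))
    b = theano.shared(np.zeros(n_out))
    return activation(T.dot(x, W) + b), [W, b]

def compile_model(n_in=784, n_hidden=64, n_out=10, lr=0.1):
    # all symbolic variables live only inside this function
    x = T.matrix('x')
    y = T.ivector('y')
    hidden, p1 = layer(x, n_in, n_hidden, T.tanh)
    probs, p2 = layer(hidden, n_hidden, n_out, T.nnet.softmax)
    loss = -T.mean(T.log(probs)[T.arange(y.shape[0]), y])
    updates = [(p, p - lr * T.grad(loss, p)) for p in p1 + p2]
    train = theano.function([x, y], loss, updates=updates)
    predict = theano.function([x], T.argmax(probs, axis=1))
    return train, predict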
luthfianto / keras_attention_wrapper.py
Created January 15, 2017 09:19 — forked from wassname/keras_attention_wrapper.py
A keras attention layer that wraps RNN layers.
"""
A keras attention layer that wraps RNN layers.
Based on tensorflow's [attention_decoder](https://github.com/tensorflow/tensorflow/blob/c8a45a8e236776bed1d14fd71f3b6755bd63cc58/tensorflow/python/ops/seq2seq.py#L506)
and [Grammar as a Foreign Language](https://arxiv.org/abs/1412.7449).
date: 20161101
author: wassname
url: https://gist.github.com/wassname/5292f95000e409e239b9dc973295327a
"""
luthfianto / Attention.py
Last active July 15, 2020 09:27 — forked from cbaziotis/Attention.py
Keras Layer that implements an Attention mechanism for temporal data. Supports Masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras.layers.core import Layer
from keras import initializers, regularizers, constraints
from keras import backend as K
class Attention(Layer):
    def __init__(self,
                 kernel_regularizer=None, bias_regularizer=None,
                 kernel_constraint=None, bias_constraint=None,
                 use_bias=True, **kwargs):
        """
luthfianto / TFQueueKeras.py
Created March 29, 2017 17:11 — forked from Dref360/TFQueueKeras.py
An example of using Keras with TF queues; this handles BatchNorm
import tensorflow as tf
import numpy as np
import keras
import keras.backend as K
from functools import reduce
from keras.models import Model
import keras.callbacks as cbks
from keras.applications import ResNet50
from keras.layers import Conv2D
import threading
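A stripped-down sketch of the underlying idea (not Dref360's code, which additionally manages the learning phase so BatchNorm trains correctly), assuming TF 1.x queue runners and Keras 2: the queue's output tensor is passed as the model's Input and the label tensor as a target tensor, so no feed_dict is needed.
import tensorflow as tf
import keras.backend as K
from keras.layers import Input, Flatten, Dense
from keras.models import Model

# toy in-memory data pushed through a TF input queue (shapes are illustrative)
images = tf.random_uniform([1000, 28, 28, 1])
labels = tf.one_hot(tf.random_uniform([1000], maxval=10, dtype=tf.int32), 10)
img_batch, lbl_batch = tf.train.batch(
    tf.train.slice_input_producer([images, labels], shuffle=True), batch_size=32)

inputs = Input(tensor=img_batch)                        # model reads straight from the queue
outputs = Dense(10, activation='softmax')(Flatten()(inputs))
model = Model(inputs, outputs)
model.compile('sgd', 'categorical_crossentropy', target_tensors=[lbl_batch])

coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(sess=K.get_session(), coord=coord)
model.fit(epochs=1, steps_per_epoch=1000 // 32)         # no x/y: data comes from the graph
coord.request_stop()
coord.join(threads)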
#!/bin/zsh
# WARNING! The script is meant to show how and what can be disabled. Don’t use it as it is, adapt it to your needs.
# Credit: Original idea and script disable.sh by pwnsdx https://gist.github.com/pwnsdx/d87b034c4c0210b988040ad2f85a68d3
# Disabling unwanted services on macOS Big Sur (11), macOS Monterey (12), macOS Ventura (13) and macOS Sonoma (14)
# Disabling SIP is required ("csrutil disable" from Terminal in Recovery)
# Modifications are written to /private/var/db/com.apple.xpc.launchd/disabled.plist and disabled.501.plist
# To revert, delete /private/var/db/com.apple.xpc.launchd/disabled.plist and disabled.501.plist and reboot: sudo rm -r /private/var/db/com.apple.xpc.launchd/*
# user
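# (Illustrative, not part of the original script.) Services are typically disabled with
# launchctl's domain-target syntax, which is what ends up recorded in the plists named above, e.g.:
#   sudo launchctl disable system/com.apple.example.daemon   # hypothetical label -> disabled.plist
#   launchctl disable user/501/com.apple.example.agent       # hypothetical label -> disabled.501.plist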