# -*- coding: utf-8 -*-
"""
Created on Tue Jul 28 15:28:15 2015
"""
from sklearn.cluster import KMeans
import numpy as np

dataset = np.array([1, 4, 5, 6, 9]).reshape(-1, 1)  # KMeans expects a 2-D array
km = KMeans(n_clusters=3)
km.fit(dataset)
print(km.labels_)
clear;
has_quadprog = exist( 'quadprog' ) == 2 | exist( 'quadprog' ) == 3;
has_linprog = exist( 'linprog' ) == 2 | exist( 'linprog' ) == 3;
rnstate = randn( 'state' ); randn( 'state', 1 );
s_quiet = cvx_quiet(true);
s_pause = cvx_pause(false);
cvx_clear; echo on
b = [2; 0; 2; 0];
cvx_begin
%% demonstrate the linear convergence rate of gradient descent
stepsize = 0.05;
x = rand(2, 1);
f_diff = 1;
f = @(x) 1/2*(x(1)^2 + 10*x(2)^2);
first_grad = @(x) [x(1); 10*x(2)];
iter = 0;
fprintf('ITER \t F_VAL \t F_VAL_U \t F_DIFF \n');
% fprintf('ITER \t\t F_VAL \t F_VAL_U \t F_DIFF \t F_GRAD \t F_GRAD_U \n');
% descent loop, reconstructed to match the printed header
% (f_val_u plays the role of F_VAL_U: the value before the update)
while f_diff > 1e-10
    f_val_u = f(x);
    x = x - stepsize * first_grad(x);
    f_diff = f_val_u - f(x);
    iter = iter + 1;
    fprintf('%d \t %.3e \t %.3e \t %.3e \n', iter, f(x), f_val_u, f_diff);
end
v = -5:0.5:5;
[x, y] = meshgrid(v);
z = x .* exp(x.^2 - y.^2) + y.*exp(-x.^2 + y.^2);
[px, py] = gradient(z);
figure
surf(x, y, z)
figure
contour(x, y, z);
hold on
quiver(x, y, px, py);
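For comparison, a Python/matplotlib sketch of the same contour-plus-quiver demo; the `Agg` backend and the output filename are illustrative choices, not from the original:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

v = np.arange(-5, 5.5, 0.5)  # same grid as the MATLAB -5:0.5:5
x, y = np.meshgrid(v, v)
z = x * np.exp(x**2 - y**2) + y * np.exp(-x**2 + y**2)
# np.gradient differentiates along rows first, then columns,
# so the return order is (py, px) -- the reverse of MATLAB's gradient
py, px = np.gradient(z)

fig, ax = plt.subplots()
ax.contour(x, y, z)
ax.quiver(x, y, px, py)
fig.savefig("gradient_field.png")
```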
Windows Registry Editor Version 5.00
[HKEY_CLASSES_ROOT\*\shell\Edit with Sublime Text]
@="Edit with &Sublime Text"
"Icon"="C:\\Program Files\\Sublime Text 3\\sublime_text.exe,0"
"MuiVerb"="Edit with Sublime Text"
[HKEY_CLASSES_ROOT\*\shell\Edit with Sublime Text\command]
@="C:\\Program Files\\Sublime Text 3\\sublime_text.exe \"%1\""
import numpy as np

# reference: https://en.wikipedia.org/wiki/Power_iteration
def power_method(M):
    """Return the dominant (unit-norm) eigenvector of M by power iteration."""
    b = np.random.rand(M.shape[1])
    diff = np.linalg.norm(b)
    while diff > 1e-6:
        b_new = np.dot(M, b)
        b_norm = np.linalg.norm(b_new)
        b_new = b_new / b_norm  # re-normalize each iterate
        diff = np.linalg.norm(b_new - b)
        b = b_new
    return b
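As a quick sanity check, `power_method` (restated below so the block runs standalone) recovers the dominant eigenpair of a small symmetric matrix via the Rayleigh quotient; the example matrix is an arbitrary choice:

```python
import numpy as np

def power_method(M):
    # iterate b <- M b / ||M b|| until successive iterates agree
    b = np.random.rand(M.shape[1])
    diff = np.linalg.norm(b)
    while diff > 1e-6:
        b_new = np.dot(M, b)
        b_new = b_new / np.linalg.norm(b_new)
        diff = np.linalg.norm(b_new - b)
        b = b_new
    return b

# A has eigenvalues 3 and 1; the dominant eigenvector is [1, 1] / sqrt(2)
A = np.array([[2.0, 1.0], [1.0, 2.0]])
v = power_method(A)
lam = v @ A @ v  # Rayleigh quotient of a unit vector ~ dominant eigenvalue
```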
import numpy as np
import pickle
import matplotlib.pyplot as plt
import scipy.io as sio
import tensorflow as tf

tf.enable_eager_execution()  # TF1-only; eager execution is the default in TF2
tmm = tf.matmul  # short alias

def _test():
    pass
yalmip('clear')
T = [0 1 0 1; 1 0 1 0; 0 1 0 1; 1 0 1 0];
x = sdpvar(4, 1);
y = sdpvar(4, 1);
assign(x, randn(4, 1));
assign(y, randn(4, 1));
const = [x'*x <= 1; y'*y <= 1];
% const = [];
""" Solve a minimax problem | |
The problme is defined as | |
$$ \min_{x} \max_{y} f(x, y) = x^2(y+1) + y^2(x+1)$$ | |
The first order gradient is | |
$$ \frac{\partial f}{\partial x} = 2x(y+1) + y^2 $$ | |
$$ \frac{\partial f}{\partial y} = x^2 + 2y(x+1) $$ | |
From the first order optimality condition, the alternatively solver | |
should solve the problem and converge to a stationary point. |
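The alternating scheme the docstring describes can be sketched as gradient descent on x and gradient ascent on y; the step size, starting point, and iteration count here are illustrative assumptions, and convergence is not guaranteed for every starting point:

```python
import numpy as np

# first-order partials from the docstring above
def grad_x(x, y):
    return 2 * x * (y + 1) + y ** 2

def grad_y(x, y):
    return x ** 2 + 2 * y * (x + 1)

def alternating_solver(x0=0.5, y0=-0.5, lr=0.05, iters=100):
    # descent step on x (min player), ascent step on y (max player)
    x, y = x0, y0
    for _ in range(iters):
        x = x - lr * grad_x(x, y)
        y = y + lr * grad_y(x, y)
    return x, y
```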
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import numpy as np
import unittest

def softmax(x):
    """ Given a vector, apply the softmax activation function

    Parameters: x (np.ndarray) -- input vector.
    Returns: np.ndarray of probabilities summing to 1.
    """
    e = np.exp(x - np.max(x))  # shift by the max for numerical stability
    return e / e.sum()
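The snippet imports `unittest` without defining any tests; a minimal test case for a softmax function (restated here so the block runs standalone) might be:

```python
import numpy as np
import unittest

def softmax(x):
    e = np.exp(x - np.max(x))  # max-shift for numerical stability
    return e / e.sum()

class TestSoftmax(unittest.TestCase):
    def test_sums_to_one(self):
        self.assertAlmostEqual(softmax(np.array([1.0, 2.0, 3.0])).sum(), 1.0)

    def test_uniform_input(self):
        # equal logits should yield a uniform distribution
        np.testing.assert_allclose(softmax(np.zeros(4)), 0.25)

suite = unittest.TestLoader().loadTestsFromTestCase(TestSoftmax)
result = unittest.TextTestRunner().run(suite)
```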