// libsvm: probability output for classification models (gist preview, truncated).
double svm_predict_probability(
    const svm_model *model, const svm_node *x, double *prob_estimates)
{
    // Probability estimates are only available for C-SVC / nu-SVC models that
    // were trained with probability output enabled (probA/probB present).
    if ((model->param.svm_type == C_SVC || model->param.svm_type == NU_SVC) &&
        model->probA != NULL && model->probB != NULL)
    {
        int i;
        int nr_class = model->nr_class;
        // One pairwise decision value per class pair.
        double *dec_values = Malloc(double, nr_class*(nr_class-1)/2);
        svm_predict_values(model, x, dec_values);
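The gist preview cuts off here. For orientation, a minimal sketch of exercising this probability path through libsvm's bundled Python bindings (svmutil); the model file name and feature values are hypothetical, and the model must have been trained with probability estimates enabled (-b 1):

from svmutil import svm_load_model, svm_predict

# Hypothetical model produced by `svm-train -b 1 ...`
model = svm_load_model('heart_scale.model')

# One sample as a sparse {index: value} dict; the label is only used for accuracy reporting.
x = [{1: 0.5, 2: -1.0, 3: 0.25}]
y = [1]

# '-b 1' routes prediction through svm_predict_probability(); p_vals then holds
# the per-class probability estimates for each sample.
p_labels, p_acc, p_vals = svm_predict(y, x, model, '-b 1')
print(p_labels, p_vals)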
dniku / i.pi (last active August 29, 2015)
Problem I from VK wild-card round #1. The language is Picat. This solution fails with WA #18.
import cp.

% Read a count Len, then Len integers; return them in List
% (prepending, so List ends up in reverse input order).
read_list = List =>
    L = [],
    Len = read_int(),
    foreach(I in 1..Len)
        V := read_int(),
        L := [V|L]
    end,
    List = L.
dniku / h.pi (last active August 29, 2015)
Problem H from VK wild-card round #1. The language is Picat.
import cp.

main =>
    % Read N points as parallel coordinate arrays X and Y.
    N = read_int(),
    X = new_array(N),
    Y = new_array(N),
    foreach(I in 1..N)
        X[I] := read_int(),
        Y[I] := read_int()
    end,
require 'matrix'

class Vector
  # Expose the normally-private element setter.
  public :"[]="  #, :set_element, :set_component

  # Infinity norm: the largest absolute component.
  def norm_inf
    return (self.map { |x| x.abs }).max
  end

  def clone
I0322 23:04:55.893093 32722 caffe.cpp:117] Use CPU.
I0322 23:04:55.894064 32722 caffe.cpp:121] Starting Optimization
I0322 23:04:55.894438 32722 solver.cpp:32] Initializing solver from parameters:
test_iter: 100
test_interval: 500
base_lr: 0.01
display: 100
max_iter: 10000
lr_policy: "inv"
gamma: 0.0001
dniku / lenet_train_test_modified.prototxt (created March 22, 2015)
Official example modified for image size 12x12.
name: "LeNet"
layers {
name: "mnist"
type: IMAGE_DATA
top: "data"
top: "label"
image_data_param {
source: "mnist_train_index.txt"
batch_size: 64
is_color: false
dniku / lenet_solver_modified.prototxt (created March 22, 2015)
Official example modified for image size 12x12.
# The train/test net protocol buffer definition
net: "lenet_train_test_modified.prototxt"
# test_iter specifies how many forward passes the test should carry out.
# In the case of MNIST, we have test batch size 100 and 100 test iterations,
# covering the full 10,000 testing images.
test_iter: 100
# Carry out testing every 500 training iterations.
test_interval: 500
# The base learning rate, momentum and the weight decay of the network.
base_lr: 0.01
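As a usage note for this solver definition, a minimal sketch of driving it from pycaffe instead of the caffe command-line tool, assuming both prototxt files above sit in the working directory:

import caffe

caffe.set_mode_cpu()                                # matches "Use CPU." in the log above
solver = caffe.SGDSolver('lenet_solver_modified.prototxt')
solver.solve()                                      # runs the full max_iter schedule, testing every test_interval iterations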
from __future__ import division, print_function
from pprint import pprint
import glob
import caffe
MODEL_FILE = '../models/lenet.prototxt'
PRETRAINED_FILE = '../models/lenet_pretrained.caffemodel'
IMAGE_DIR = 'test'
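The preview stops after the constants; a hedged sketch of how such a script might continue, using the caffe.Classifier interface available in 2015-era pycaffe (the glob pattern and grayscale handling are assumptions, not the author's code):

caffe.set_mode_cpu()
net = caffe.Classifier(MODEL_FILE, PRETRAINED_FILE)

results = {}
for path in sorted(glob.glob(IMAGE_DIR + '/*.png')):    # assumed file extension
    img = caffe.io.load_image(path, color=False)         # load as grayscale
    probs = net.predict([img], oversample=False)[0]       # per-class probabilities
    results[path] = int(probs.argmax())
pprint(results)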
dniku / expansions.py (created March 19, 2015)
Expansions of a natural number into sums.
def get_expansions(num, nterms):
    # Enumerate all nterms-tuples of non-negative integers that sum to num.
    result = []

    def run(cur_sum, cur_vals, remain):
        if remain == 0:
            if cur_sum == num:
                result.append(tuple(cur_vals))
            return
        elif remain == 1:
            # Only one way to finish: put the remainder in the last slot.
            cur_vals.append(num - cur_sum)
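            # (The gist preview is truncated here; the remainder below is a
            #  hedged reconstruction of how the recursion plausibly finishes,
            #  not the author's original code.)
            run(num, cur_vals, 0)
            cur_vals.pop()
        else:
            # Try every value the next term could take, then recurse.
            for v in range(num - cur_sum + 1):
                cur_vals.append(v)
                run(cur_sum + v, cur_vals, remain - 1)
                cur_vals.pop()

    run(0, [], nterms)
    return result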
<!doctype html>
<html>
  <head>
    <meta http-equiv="content-type" content="text/html; charset=utf-8">
    <title>Sandbox</title>
  </head>
  <body>
    <script src="sandbox.js"></script>
  </body>
</html>