- STL-style version: uses reverse_iterator to simplify the boundary checks and can handle duplicate elements. See also: generating all permutations, next permutation, and the k-th permutation. A sketch follows this list.
- One-dimensional DP version: O(N^2) time complexity, O(N) space complexity. A sketch also follows this list.
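A minimal sketch of the STL-style approach from the first bullet, assuming the same semantics as std::next_permutation (the name nextPermutation and the vector<int> signature are just for illustration). Reverse iterators keep the right-to-left scans inside standard algorithms, and the strict comparisons are what make duplicates safe.

#include <algorithm>
#include <vector>

// Returns false (leaving the sequence sorted ascending) when the input is
// already the last permutation, true otherwise.
bool nextPermutation(std::vector<int>& nums) {
    // Pivot: first element, scanning right to left, that is smaller than its
    // right neighbor. reverse_iterator turns that scan into is_sorted_until.
    auto pivot = std::is_sorted_until(nums.rbegin(), nums.rend());
    const bool has_next = (pivot != nums.rend());
    if (has_next) {
        // Rightmost element strictly greater than *pivot; upper_bound works
        // because the range [rbegin, pivot) is sorted in the reverse view.
        auto target = std::upper_bound(nums.rbegin(), pivot, *pivot);
        std::iter_swap(pivot, target);
    }
    // Make the suffix ascending again (or the whole range if no pivot).
    std::reverse(nums.rbegin(), pivot);
    return has_next;
}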
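The second bullet does not name the problem the one-dimensional DP solves, so the sketch below uses longest increasing subsequence purely as a stand-in for the pattern: one dp array indexed by position (O(N) space) and an inner scan over earlier positions (O(N^2) time).

#include <algorithm>
#include <vector>

// dp[i] = length of the longest strictly increasing subsequence ending at i.
int longestIncreasingSubsequence(const std::vector<int>& nums) {
    if (nums.empty()) return 0;
    std::vector<int> dp(nums.size(), 1);  // single 1-D table: O(N) space
    int best = 1;
    for (std::size_t i = 1; i < nums.size(); ++i) {
        for (std::size_t j = 0; j < i; ++j) {  // inner scan: O(N^2) time overall
            if (nums[j] < nums[i]) dp[i] = std::max(dp[i], dp[j] + 1);
        }
        best = std::max(best, dp[i]);
    }
    return best;
}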

#ifndef CAFFE_NET_HPP_
#define CAFFE_NET_HPP_
#include <map>
#include <set>
#include <string>
#include <utility>
#include <vector>
#include "caffe/blob.hpp"

#include <algorithm>
#include <map>
#include <set>
#include <string>
#include <utility>
#include <vector>
#include "hdf5.h"
#include "caffe/common.hpp"

import numpy as np
from scipy import linalg, optimize
MAX_ITER = 100
def group_lasso(X, y, alpha, groups, max_iter=MAX_ITER, rtol=1e-6,
                verbose=False):
    """
    Linear least-squares with l2/l1 regularization solver.
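For reference, the "l2/l1 regularization" in the docstring is the group lasso penalty: an l2 norm within each coefficient group, summed (an l1 norm) across groups. Up to the exact scaling of the data-fit term, which depends on the implementation, the objective is

\min_w \; \tfrac{1}{2}\,\lVert X w - y \rVert_2^2 \;+\; \alpha \sum_{g \in \mathrm{groups}} \lVert w_g \rVert_2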

package topic
import spark.broadcast._
import spark.SparkContext
import spark.SparkContext._
import spark.RDD
import spark.storage.StorageLevel
import scala.util.Random
import scala.math.{ sqrt, log, pow, abs, exp, min, max }
import scala.collection.mutable.HashMap

import spark.SparkContext
import SparkContext._
/**
 * A port of [[http://blog.echen.me/2012/02/09/movie-recommendations-and-more-via-mapreduce-and-scalding/]]
 * to Spark.
 * Uses movie ratings data from the MovieLens 100k dataset found at [[http://www.grouplens.org/node/73]]
 */
object MovieSimilarities {

/**
 * Copyright (c) 2009 Carnegie Mellon University.
 * All rights reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *

/**
 * iver:
 * In ALS, we factorize the matrix into two lower-rank matrices such that A ~= U*V'.
 * It is not a symmetric matrix factorization where A ~= Q'Q, as you wrote.
 *
 * The non-zero entries of the matrix A are the edges between user and item nodes.
 * The edge direction is always user -> item.
 *
 * The user latent vectors are stored in the user vertices (vertex.num_out_edges() > 0).
 * The item latent vectors are stored in the item vertices (vertex.num_out_edges() == 0).
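The comment above describes the factorization and data layout; as a minimal sketch of the per-user half-step it implies (not GraphLab code; Eigen is assumed for linear algebra, and update_user, rated_items, lambda are illustrative names), each user vector is the regularized least-squares fit of that user's ratings against the item vectors on its out-edges.

#include <Eigen/Dense>
#include <vector>

// One ALS half-step for a single user: solve (V'V + lambda*I) u = V'r, where
// the rows of V are the latent vectors of the items this user rated and r
// holds the corresponding ratings. Item vectors are refreshed symmetrically.
Eigen::VectorXd update_user(const std::vector<Eigen::VectorXd>& rated_items,
                            const std::vector<double>& ratings,
                            double lambda) {
  const Eigen::Index D = rated_items.front().size();
  Eigen::MatrixXd XtX = lambda * Eigen::MatrixXd::Identity(D, D);
  Eigen::VectorXd Xty = Eigen::VectorXd::Zero(D);
  for (std::size_t j = 0; j < rated_items.size(); ++j) {
    XtX += rated_items[j] * rated_items[j].transpose();  // accumulate V_j V_j'
    Xty += ratings[j] * rated_items[j];                   // accumulate r_j V_j
  }
  return XtX.ldlt().solve(Xty);  // new user latent vector
}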

#!/usr/bin/env python
from numpy import asmatrix, asarray, ones, zeros, mean, sum, arange, prod, dot, loadtxt
from numpy.random import random, randint
import pickle
MISSING_VALUE = -1  # a constant I will use to denote missing integer values
def impute_hidden_node(E, I, theta, sample_hidden):