
@kmhofmann
kmhofmann / building_tensorflow.md
Last active August 11, 2024 14:14
Building TensorFlow from source

Building TensorFlow from source (TF 2.3.0, Ubuntu 20.04)

Why build from source?

The official instructions on installing TensorFlow are here: https://www.tensorflow.org/install. If you just want to install TensorFlow using pip, are running a supported Ubuntu LTS distribution, and are happy to install the respective tested CUDA versions (which are often outdated), then by all means go ahead. A good alternative may be to run a Docker image.

I am usually unhappy with installing what are, in effect, pre-built binaries. These binaries are often not compatible with the Ubuntu version I am running, the CUDA version I have installed, and so on. Furthermore, they may be slower than binaries optimized for the target architecture, since certain instruction sets (e.g. AVX2, FMA) are not used.

So building TensorFlow from source becomes a necessity. The official instructions on building TensorFlow from source are here: https://www.tensorflow.org/install/source.
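
A quick way to check on Linux whether your CPU actually supports those instruction sets (not part of the official instructions, just a handy sanity check before committing to a source build):

grep -oE 'avx2|fma' /proc/cpuinfo | sort -u   # prints each supported flag once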

Reset

You can reset the commit for a local branch using git reset.

To change the commit of a local branch:

git fetch                        # update the remote-tracking branches first
git reset --hard origin/master   # point the current branch (and working tree) at origin/master
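
Note that git reset --hard moves the branch you currently have checked out (and overwrites your working tree). To repoint a different local branch without switching to it, git branch -f works; the branch name below is just a placeholder:

git branch -f feature-x origin/master   # force local 'feature-x' to point at origin/master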
@evanwill
evanwill / gitBash_windows.md
Last active October 24, 2024 17:27
how to add more utilities to git bash for windows, wget, make

How to add more to Git Bash on Windows

Git for Windows comes bundled with the "Git Bash" terminal, which is incredibly handy for Unix-like commands on a Windows machine. It is missing a few standard Linux utilities, but it is easy to add ones that have a Windows binary available.

The basic idea is that C:\Program Files\Git\mingw64\ is your / directory according to Git Bash. (Note: depending on how you installed it, the directory might be different; from the Start menu, right-click the Git Bash icon and open the file location. It might be something like C:\Users\name\AppData\Local\Programs\Git, and the mingw64 folder in that directory is your root. Find it by using pwd -W.) If you go to that directory, you will find the typical Linux root folder structure (bin, etc, lib, and so on).

If you are missing a utility, such as wget, track down a binary for Windows and copy the files to the corresponding directories. Sometimes the Windows binaries have funny prefixes, so you may need to rename the executable (e.g., to wget.exe) after copying it, as in the sketch below.
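
For example, assuming you have already downloaded a standalone wget.exe (all paths below are illustrative; adjust them to your own install, which you can locate with pwd -W):

cp /c/Users/name/Downloads/wget.exe "/c/Program Files/Git/mingw64/bin/"
wget --version   # confirm Git Bash now finds the utility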

@alexgmcm
alexgmcm / predictOneVsAll.m
Created August 3, 2012 12:45
Stanford Machine Learning Exercise 3 code
function p = predictOneVsAll(all_theta, X)
%PREDICTONEVSALL Predict the label for a trained one-vs-all classifier. The labels
%are in the range 1..K, where K = size(all_theta, 1).
% p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
% for each example in the matrix X. Note that X contains the examples in
% rows. all_theta is a matrix where the i-th row is a trained logistic
% regression theta vector for the i-th class. You should set p to a vector
% of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
% for 4 examples)
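The preview is truncated here; what follows is a hedged sketch of the usual vectorized prediction step, not necessarily this gist's exact solution. Because the logistic sigmoid is monotonic, the class with the largest raw score X * all_theta' is also the class with the largest predicted probability:

m = size(X, 1);
X = [ones(m, 1) X];                    % prepend the intercept term
[~, p] = max(X * all_theta', [], 2);   % row-wise argmax over the K class scores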
@alexgmcm
alexgmcm / oneVsAll.m
Created August 2, 2012 13:17
Stanford Machine Learning Course: ex3 oneVsAll.m
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
% [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
% logistic regression classifiers and returns each of these classifiers
% in a matrix all_theta, where the i-th row of all_theta corresponds
% to the classifier for label i
% Some useful variables
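The body is cut off in this preview; a typical (hedged) implementation trains one regularized binary classifier per label, using costFunctionReg from the gist below and a gradient-based optimizer such as fminunc (the course itself supplies fmincg with the same calling convention):

m = size(X, 1);
n = size(X, 2);
all_theta = zeros(num_labels, n + 1);
X = [ones(m, 1) X];                                 % add the intercept column
options = optimset('GradObj', 'on', 'MaxIter', 50);
for c = 1:num_labels
    % one-vs-all: label c is the positive class, everything else is negative
    initial_theta = zeros(n + 1, 1);
    theta = fminunc(@(t) costFunctionReg(t, X, double(y == c), lambda), ...
                    initial_theta, options);
    all_theta(c, :) = theta';
end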
@alexgmcm
alexgmcm / costFunctionReg.m
Created August 1, 2012 15:34
Stanford Machine Learning Exercise 2 code
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
% J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
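The computation itself is truncated in the preview. The standard regularized cost and gradient (a sketch assuming the exercise's sigmoid helper; note that the intercept theta(1) is conventionally not regularized) look like this:

h = sigmoid(X * theta);                            % predictions for all m examples
J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
    + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);
grad = (1 / m) * (X' * (h - y)) + (lambda / m) * [0; theta(2:end)];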
@BPScott
BPScott / readme.md
Last active February 7, 2024 16:16
Github suggestion: Per-organization email overrides

This totally happened; y'all can stop +1ing this now. GitHub blog post. Direct link to the settings where you can set this.


Per-organization / per-repo email overrides: a feature suggestion

Here the concepts "organization" and "user" are interchangeable; I'm talking about an entity that owns a repo, whether it is jQuery or John Resig. I'll stick to using "organization", as it best represents my original use case.

TL;DR
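
For context, the local workaround this suggestion improves on is plain git config: override user.email per clone, or (since git 2.13) per directory tree with a conditional include. Addresses and paths below are placeholders:

git config user.email "scott@work.example"   # inside one repo: overrides the global email

# ~/.gitconfig fragment: use a separate identity for everything under ~/work/
[includeIf "gitdir:~/work/"]
    path = ~/.gitconfig-work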

@denzilc
denzilc / nnCostFunction.m
Created November 12, 2011 15:44
Neural Network Cost Function
function [J grad] = nnCostFunction(nn_params, ...
                                   input_layer_size, ...
                                   hidden_layer_size, ...
                                   num_labels, ...
                                   X, y, lambda)
%NNCOSTFUNCTION Implements the neural network cost function for a two-layer
%neural network which performs classification
% [J grad] = NNCOSTFUNCTION(nn_params, input_layer_size, hidden_layer_size, ...
% num_labels, X, y, lambda) computes the cost and gradient of the neural network. The
% parameters for the neural network are "unrolled" into the vector
% nn_params and need to be converted back into the weight matrices.
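The preview stops here; in the exercise, the immediate next step is reshaping the unrolled nn_params vector back into the two weight matrices, along these lines:

% recover Theta1 (hidden layer weights) and Theta2 (output layer weights)
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));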