- Disable Windows Fast-Startup
- Disable Secure Boot
The file page.st goes in the templates/ directory in the Gitit wiki home directory. You'll put the Ace JavaScript and CSS files in static/.
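Concretely, the layout under the wiki home directory ends up looking something like the sketch below (the js/ and css/ subdirectories and the Ace file names are assumptions; any path under static/ that page.st actually references will do):

```
<wiki home>/
  templates/
    page.st        # customised page template that loads the editor
  static/
    js/ace.js      # Ace editor script (file name is an assumption)
    css/...        # any Ace CSS the template references
```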
- Feature Learning
  - Learning Feature Representations with K-means by Adam Coates and Andrew Y. Ng
  - The devil is in the details: an evaluation of recent feature encoding methods by Chatfield et al.
  - Emergence of Object-Selective Features in Unsupervised Feature Learning by Coates and Ng
  - Scaling Learning Algorithms towards AI by Bengio and LeCun
  - A Theory of Feature Learning by Brendan van Rooyen and Robert C. Williamson
- Deep Learning
  - Dropout: A Simple Way to Prevent Neural Networks from Overfitting by Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever and Ruslan Salakhutdinov
  - [Understanding
import torch

def get_jacobian(net, x, noutputs):
    # Compute the Jacobian of net at x in a single backward pass by
    # batching `noutputs` copies of the input and back-propagating an
    # identity matrix.
    x = x.squeeze()
    n = x.size()[0]
    x = x.repeat(noutputs, 1)          # shape: (noutputs, n)
    x.requires_grad_(True)
    y = net(x)                         # shape: (noutputs, noutputs)
    y.backward(torch.eye(noutputs))    # row i back-propagates output i
    return x.grad.data                 # shape: (noutputs, n), the Jacobian
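For a quick sanity check, here is a minimal usage sketch (the layer sizes are arbitrary) comparing the result against torch.autograd.functional.jacobian:

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 3)   # toy network: R^4 -> R^3
x = torch.randn(4)

J = get_jacobian(net, x, noutputs=3)                 # shape (3, 4)
J_ref = torch.autograd.functional.jacobian(net, x)   # autograd reference

print(torch.allclose(J, J_ref))   # expected: True
```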
This document was originally written several years ago. At the time I was working as an execution core verification engineer at Arm. The following points are coloured heavily by working in and around the execution cores of various processors. Apply a pinch of salt; points contain varying degrees of opinion.
It is still my opinion that RISC-V could be much better designed; though I will also say that if I were building a 32- or 64-bit CPU today I'd likely implement the architecture to benefit from the existing tooling.
Mostly based upon the RISC-V ISA spec v2.0. Some updates have been made for v2.2.
The RISC-V ISA has pursued minimalism to a fault. There is a large emphasis on minimizing instruction count, normalizing the encoding, etc. This pursuit of minimalism has resulted in false orthogonalities (such as reusing the same instruction for branches, calls and returns) and a requirement for superfluous instructions, which impacts code density both in terms of size and number of instructions.
using AdvancedMH
using ArraysOfArrays
using CairoMakie
using DiffEqNoiseProcess
using Distributions
using StochasticDiffEq
using Turing
using Random

# Custom Crank–Nicolson proposal type plugging into AdvancedMH's Proposal interface.
struct CrankNicolsonProposal{P,T} <: AdvancedMH.Proposal{P}
import traceback
import openai
import sys

# list models
models = openai.Model.list()

def baka(error, character="tsundere"):
    exc_type, exc_value, exc_traceback = sys.exc_info()
    traceback_list = traceback.extract_tb(exc_traceback)
import copy
import torch
import torch.nn as nn

class DecayToInit(nn.Module):
    def __init__(self, param: torch.Tensor):
        super().__init__()
        # Register the given tensor as a buffer so it is carried in state_dict
        # and moved with the module across devices.
        self.register_buffer("param", param)