I hereby claim:
- I am effigies on github.
- I am effigies (https://keybase.io/effigies) on keybase.
- I have a public key whose fingerprint is 1430 CCDA 4D04 39C7 598A 5815 D0CD EB5D 32A0 D8C7
To claim this, I am signing this object:
# Based on Mumford et al. (2011)
# Derived from a figure, as there was no actual math presented :\
#
# It's a sort of minimal-assumption regression regularization method
import numpy as np
def turnerLeastSquares(designMatrix, samples):
    nuisance = np.sum(designMatrix, axis=1).reshape((designMatrix.shape[0], 1))
    # Completion sketch: fit each trial column alongside the summed nuisance
    # regressor, keeping each trial's beta estimate
    betas = []
    for i in range(designMatrix.shape[1]):
        X = np.hstack((designMatrix[:, i:i + 1], nuisance))
        betas.append(np.linalg.lstsq(X, samples, rcond=None)[0][0])
    return np.array(betas)
#!/bin/sh
#
# /etc/chromium-browser/default
#
# Default settings for chromium-browser. This file is sourced by /bin/sh from
# /usr/bin/chromium-browser

# Options to pass to chromium-browser
MIN_SSL="tls1"
RC4="0x0004,0x0005,0xc007,0xc011"
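Presumably these variables feed the `CHROMIUM_FLAGS` line that `/usr/bin/chromium-browser` reads from this file. A hedged sketch of that composition follows; `--ssl-version-min` and `--cipher-suite-blacklist` are real (historical) Chromium switches, but combining them this way is an assumption about the truncated remainder of the file:

```shell
# Hypothetical continuation of /etc/chromium-browser/default:
# disallow SSLv3 and blacklist the RC4 cipher suites listed above
MIN_SSL="tls1"
RC4="0x0004,0x0005,0xc007,0xc011"
CHROMIUM_FLAGS="--ssl-version-min=$MIN_SSL --cipher-suite-blacklist=$RC4"
echo "$CHROMIUM_FLAGS"
```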
#!python3
class TypedDict(dict):
    keytype = valtype = keymap = valmap = valid_keys = typemap = None
    def __init__(self, mapping=None, **kwargs):
        if self.typemap is not None and self.valid_keys is None:
            self.valid_keys = set(self.typemap)
        super(TypedDict, self).__init__()
        if mapping is None:
            mapping = {}
        # Completion sketch: route entries through __setitem__ so any
        # key/value coercion hooks defined by subclasses are applied
        for key, val in dict(mapping, **kwargs).items():
            self[key] = val
#!/usr/bin/env python3
import sys
import os
import re
import argparse
import subprocess
import tempfile
from functools import partial

__version__ = 0.1
import traitlets

class _Undefined(object):
    obj = None
    def __new__(cls):
        if cls.obj is None:
            cls.obj = object.__new__(cls)
        return cls.obj

class _UseDefault(_Undefined):
    pass
name: base
channels:
  - conda-forge
  - defaults
dependencies:
  - apptools=4.4.0=py27_0
  - ca-certificates=2018.4.16=0
  - cairo=1.14.6=0
  - conda=4.5.4=py27_0
  - conda-env=2.6.0=0
Cluster thresholding in multi-voxel pattern analysis (MVPA) is an open problem: many figures of merit are possible, and few (if any) have been analyzed thoroughly enough to permit a parametric solution or to guarantee compatibility with pre-computed simulations. Stelzer et al. (2012) represents probably the most conservative approach, constructing voxel-wise and then cluster-wise null distributions at the group level, based on permuting the training labels at the individual level.
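The group-level cluster null can be sketched in a toy 1-D setting. This is not the Stelzer et al. procedure itself: the function names are mine, the thresholds are illustrative, and sign-flipping subject accuracy maps around chance stands in for their full label-permutation scheme. The shape of the idea, recording the maximum cluster size under each permutation to form a null distribution, is the same:

```python
import numpy as np

def cluster_sizes(mask):
    """Sizes of contiguous supra-threshold runs in a 1-D boolean mask."""
    sizes, run = [], 0
    for v in mask:
        if v:
            run += 1
        elif run:
            sizes.append(run)
            run = 0
    if run:
        sizes.append(run)
    return sizes

def cluster_null(accuracies, chance=0.5, thresh=0.55, n_perm=1000, seed=0):
    """Null distribution of maximum cluster sizes, via sign-flipping
    subject-level accuracy maps (shape: n_subjects x n_voxels) about chance."""
    rng = np.random.default_rng(seed)
    n_subj, _ = accuracies.shape
    max_sizes = np.empty(n_perm, dtype=int)
    for p in range(n_perm):
        flips = rng.choice([-1, 1], size=(n_subj, 1))
        perm_mean = chance + (flips * (accuracies - chance)).mean(axis=0)
        sizes = cluster_sizes(perm_mean > thresh)
        max_sizes[p] = max(sizes) if sizes else 0
    return max_sizes

rng = np.random.default_rng(1)
acc = 0.5 + 0.05 * rng.standard_normal((10, 50))
null = cluster_null(acc, n_perm=200)
```

A cluster in the observed (unpermuted) map is then significant at level alpha if its size exceeds the `1 - alpha` quantile of `null`.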
In "Mapping the cortical representation of speech sounds in a syllable repetition task", we adapted this approach to skip the voxel-wise null distribution,
This document lays out a set of Python packaging practices. I don't claim they are best practices, but they fit my needs, and might fit yours.
This document has been superseded as of July 2020.
This was written in July 2019. As of this writing, Python 2.7 and Python 3.5 still had not reached end-of-life.
{
  "Name": "NARPS",
  "Description": "Basic NARPS model",
  "Input": {
    "task": "MGT"
  },
  "Steps": [
    {
      "Level": "run",
      "Transformations": [