The sequence is defined by:
- A(0) = 1
- A(i+1) = A(i) + 1 or A(i) * 2
Given A(n) = some target number, find the smallest possible n.
def find_smallest_n(An):
    # Work backwards from An to 1: reversing the operations means
    # subtracting 1 (undoes +1) or halving (undoes *2). Halving
    # whenever possible is optimal, since it shrinks the value fastest.
    n = 0
    while An > 1:
        if An % 2 == 0:
            An //= 2
        else:
            An -= 1
        n += 1
    return n
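The greedy answer can be cross-checked by brute force: a breadth-first search over the forward operations, counting levels until the target appears (`bfs_smallest_n` is an illustrative helper, not from the original):

```python
from collections import deque

def bfs_smallest_n(target):
    # Forward BFS from A(0) = 1 using the two allowed operations.
    # Both operations only increase the value, so anything above the
    # target can be pruned.
    seen, frontier, n = {1}, deque([1]), 0
    while target not in seen:
        n += 1
        for _ in range(len(frontier)):
            a = frontier.popleft()
            for nxt in (a + 1, a * 2):
                if nxt <= target and nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return n

print(bfs_smallest_n(10))  # → 4  (1 -> 2 -> 4 -> 5 -> 10)
```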
import numpy as np
from sklearn.datasets import make_classification

# Data with features on different scales
n_classes = 2
X_clean, y_clean = make_classification(
    n_samples=500, n_features=2, n_classes=n_classes, n_redundant=0,
    scale=(10, 100), random_state=0)

# Add outliers to the data (illustrative values, far outside the
# inlier range)
rng = np.random.RandomState(0)
X = np.vstack([X_clean, rng.uniform(low=-1000, high=1000, size=(25, 2))])
y = np.hstack([y_clean, rng.randint(n_classes, size=25)])
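The comparison this data sets up can be sketched end to end; the scaler choice and outlier values below are illustrative assumptions, not from the original:

```python
import numpy as np
from sklearn.preprocessing import RobustScaler, StandardScaler

rng = np.random.RandomState(0)
# Two features on very different scales, plus a handful of far-out points.
X = np.vstack([rng.normal(scale=(10, 100), size=(500, 2)),
               rng.uniform(low=-1000, high=1000, size=(25, 2))])
X_std = StandardScaler().fit_transform(X)  # mean/std based, outlier-sensitive
X_rob = RobustScaler().fit_transform(X)    # median/IQR based, robust
```

RobustScaler centers on the median and scales by the interquartile range, so the 25 outliers barely distort the bulk of the data, whereas they pull the mean and inflate the standard deviation used by StandardScaler.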
""" | |
=============================== | |
Gradient Boosting Classifier CV | |
=============================== | |
Gradient boosting is an ensembling technique where several weak learners | |
(regression trees) are combined to yield a powerful single model, in an | |
iterative fashion. | |
:class:`sklearn.ensemble.GradientBoostingClassifierCV` enables us to |
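`GradientBoostingClassifierCV` is not part of released scikit-learn (this docstring appears to come from a development branch), so the snippet below sketches a comparable cross-validated search over the number of boosting stages using the released API instead; the grid values are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, random_state=0)
# Cross-validate over the number of boosting iterations.
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid={"n_estimators": [25, 50, 100]}, cv=3)
search.fit(X, y)
best_n = search.best_params_["n_estimators"]
```

This refits the whole ensemble for every candidate, which is wasteful compared to warm-starting a single model, but it relies only on stable, released APIs.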
# Write a function that takes a list of strings and returns the sum of
# the items that represent an integer.
def sum_up_int_terms(strings):
    total = 0
    for item in strings:
        try:
            total += int(item)
        except ValueError:
            pass
    return total
import numpy as np
from sklearn.utils.validation import check_random_state

def mcar_mask(X, y=None, proba=0.1, random_state=None):
    """Generate a MCAR mask to uniformly drop values."""
    rng = check_random_state(random_state)
    # True marks an entry to drop; each entry is dropped independently
    # with probability ``proba``.
    return rng.uniform(size=X.shape) < proba
import numpy as np
from sklearn.utils.validation import check_random_state

def drop_values_mcar(X, y=None, missing_values=np.nan,
                     missing_rate=0.1, random_state=None):
    """Drop values from X completely at random (MCAR).

    Parameters
    ----------
    missing_rate : float in range [0, 1), default 0.1
        The total fraction of values that must be missing.
        If some values are already missing (say a fraction ``x``), a
        further ``missing_rate - x`` fraction of values is dropped to
        reach the target. That is, at the end, the dataset will contain
        a total of ``(X.shape[0] * X.shape[1]) * missing_rate`` missing
        values.
    """
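Only the signature and docstring survive in this fragment; a minimal sketch of a body implementing the documented semantics (topping up any pre-existing missing values to the target fraction) might look like:

```python
import numpy as np

def drop_values_mcar(X, y=None, missing_values=np.nan,
                     missing_rate=0.1, random_state=None):
    # Sketch, not the original implementation: drop just enough extra
    # entries so the total missing fraction equals ``missing_rate``.
    rng = np.random.RandomState(random_state)
    X = np.array(X, dtype=float, copy=True)
    flat = X.reshape(-1)                     # view into the copy
    observed = np.flatnonzero(~np.isnan(flat))
    n_target = int(flat.size * missing_rate)
    n_already = flat.size - observed.size
    n_to_drop = max(n_target - n_already, 0)
    # Choose the extra entries uniformly among the observed ones.
    drop = rng.choice(observed, size=n_to_drop, replace=False)
    flat[drop] = missing_values
    return X
```

Using `np.random.RandomState` directly here is an assumption; the original would likely route through `check_random_state` as its imports suggest.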
export LANG=en_US.UTF-8

# COMMON CONFS
# Grep the full process listing for a pattern; note that the second
# alias shadows the system pgrep
alias checktuns="ps x | grep 'ssh -N'"
alias pgrep="ps x | grep "

# Set the scikit path env var
export SCIKIT_LEARN_PATH=~/raghav/code/scikit-learn
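For what the comment above describes, the stock pgrep can already match against the full command line and print it, without shadowing the real tool (a suggestion, not from the original config; -a is the procps-ng/Linux flag):

```shell
# -f matches the pattern against the full command line;
# -a prints the full command line of each match
pgrep -af 'ssh -N'
```

Unlike the ps|grep alias, this does not list the grep process itself among the matches.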