<!DOCTYPE html>
<html>
<head>
<title>k-means</title>
</head>
<body>
<img src="https://upload.wikimedia.org/wikipedia/commons/e/ea/K-means_convergence.gif">
</body>
</html>
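The GIF above shows k-means converging over successive iterations. As a rough companion, here is a minimal NumPy sketch of that same loop; the random data, k = 3, and the iteration count are illustrative assumptions, not part of the original snippet.

import numpy as np

# Illustrative random 2-D data (not from the original snippet)
rng = np.random.default_rng(0)
points = rng.random((100, 2))
k, n_iters = 3, 10

# Initialise centroids from k distinct data points
centroids = points[rng.choice(len(points), size=k, replace=False)]
for _ in range(n_iters):
    # Assignment step: label each point with its nearest centroid
    dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: move each centroid to the mean of its cluster
    # (keep the old centroid if a cluster ends up empty)
    centroids = np.array([points[labels == j].mean(axis=0)
                          if np.any(labels == j) else centroids[j]
                          for j in range(k)])
print(centroids)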

# Thank you

Thank you to everyone for visiting the website. Join me on my journey, and please leave your reviews using the vote.

import numpy as np

class Perceptron:
    def __init__(self, x):
        self.x = x
        # One weight per input feature, plus a bias term at index 0
        self.weight = np.random.rand(len(x[0]) + 1)

# Make a prediction with weights
def predict(row, weights):
    activation = weights[0]  # bias term
    for i in range(len(row) - 1):
        activation += weights[i + 1] * row[i]
    return 1.0 if activation >= 0.0 else 0.0

# Calculate accuracy percentage
def accuracy_metric(actual, predicted):
    correct = 0
    for i in range(len(actual)):
        if actual[i] == predicted[i]:
            correct += 1
    return correct / float(len(actual)) * 100.0

# Estimate perceptron weights using stochastic gradient descent
def train_weights(train, l_rate=0.001, n_epoch=500, verbose=True):
    weights = [0.0 for i in range(len(train[0]))]
    for epoch in range(n_epoch):
        sum_error = 0.0
        for row in train:
            prediction = predict(row, weights)
            error = row[-1] - prediction
            sum_error += error ** 2
            # Update bias and weights: w = w + l_rate * (expected - predicted) * x
            weights[0] = weights[0] + l_rate * error
            for i in range(len(row) - 1):
                weights[i + 1] = weights[i + 1] + l_rate * error * row[i]
        if verbose:
            print('>epoch=%d, lrate=%.3f, error=%.3f' % (epoch, l_rate, sum_error))
    return weights
# Each row is [x1, x2, class]
dataset = [[2.7810836, 2.550537003, 0],
           [1.465489372, 2.362125076, 0],
           [3.396561688, 4.400293529, 0],
           [1.38807019, 1.850220317, 0],
           [3.06407232, 3.005305973, 0],
           [7.627531214, 2.759262235, 1]]
In words: the activation is sum(weight_i * x_i) + bias, and the prediction is 1.0 if the activation is >= 0.0, otherwise 0.0.
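For example, with illustrative weights (not from the gist) of bias = -0.1, w1 = 0.2, and w2 = -0.2, the first dataset row [2.7810836, 2.550537003] gives activation = -0.1 + 0.2 * 2.7810836 + (-0.2) * 2.550537003 ≈ -0.054, which is below 0.0, so the prediction is 0.0, matching that row's class.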
Here w is a weight, the learning rate is a hyperparameter that controls how much the weights and bias are adjusted on each update, expected is the true class value from the training row, and predicted is the value the model produced. The update rule is:

w = w + learning_rate * (expected - predicted) * x
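Putting the pieces together, here is a minimal end-to-end run on the dataset above; the learning rate and epoch count are illustrative choices rather than the defaults used in train_weights.

# Train on the sample dataset, then measure accuracy on the same data
weights = train_weights(dataset, l_rate=0.1, n_epoch=5, verbose=False)
predictions = [predict(row, weights) for row in dataset]
actual = [row[-1] for row in dataset]
print(accuracy_metric(actual, predictions))  # expect 100.0 on this small, linearly separable sample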