@jamesmurdza
Created August 31, 2024 07:17

00:00:18 welcome all right go ahead

00:00:25 today we are going to be talking about neural networks

00:00:32 we'll discuss the motivation behind why researchers started looking into neural networks

00:00:44 first, we will cover why these networks show impressive performance now and not earlier

00:01:01 we will build on that and delve into the mathematical background

00:01:08 finally, we will talk about the forward pass, the backward pass, and activation functions

00:01:17 can anyone tell me what you know about neural networks?

00:01:30 neural networks are used for large language models

00:01:51 they might be trying to replicate how the human brain functions

00:02:10 a neural network is structured with layers of linear transformations and nonlinear activation functions

00:02:40 the motivation behind neural networks is biologically inspired from how our brains work

00:03:04 neuroscientists know very little about how the brain works

00:03:37 the basic building block of a neural network is called the neuron

00:04:17 a biological neuron has dendrites

00:04:42 a biological neuron takes inputs through its dendrites, processes them, and produces an output

00:05:01 neural networks process data similarly by taking input, processing it, and producing an output

00:05:47 a single perceptron in a neural network takes input data in the form of a vector

00:16:01 processing in a perceptron involves weights, biases, and an activation function
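
To make that concrete, here is a minimal sketch of a single perceptron in NumPy; the input values, weights, and the choice of a sigmoid activation are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

def perceptron(x, w, b):
    """One artificial neuron: weighted sum of inputs plus bias, then an activation."""
    z = np.dot(w, x) + b              # linear step: w . x + b
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid activation squashes z into (0, 1)

x = np.array([0.5, -1.2, 3.0])   # input vector, one value per feature
w = np.array([0.4, 0.6, -0.1])   # one weight per input
b = 0.2                          # bias term
print(perceptron(x, w, b))       # a single scalar output
```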

00:16:18 without activation functions, neural networks reduce to simple linear regression
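
One short derivation shows why (notation assumed here): composing two purely linear layers yields just another linear map, so without a nonlinearity in between, extra layers add no expressive power.

```latex
W_2 (W_1 x + b_1) + b_2 \;=\; (W_2 W_1)\,x + (W_2 b_1 + b_2) \;=\; W' x + b'
```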

00:09:07 the intuition behind neural networks is to take raw features and combine them in certain ways to produce better features

00:13:24 neural networks shine in combining features in complex ways unknown to us

00:13:42 hidden layers in neural networks enable them to spot patterns and model complex behaviors

00:28:05 neural networks can have thousands of hidden layers

00:14:27 let's give the formal mathematical definition of a neural network
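
A common way to write this definition, with the notation assumed here rather than quoted from the lecture: an L-layer network applies an affine map followed by an activation at each layer.

```latex
a^{(0)} = x, \qquad
a^{(\ell)} = \sigma\!\left( W^{(\ell)} a^{(\ell-1)} + b^{(\ell)} \right)
\quad \text{for } \ell = 1, \dots, L, \qquad
\hat{y} = a^{(L)}
```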

00:29:09 a forward pass, or forward propagation, is how a neural network takes an input and produces an output decision
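
A minimal forward-pass sketch over a list of layers; the layer sizes, random weights, and ReLU activation below are illustrative assumptions:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, layers):
    """Push an input through a list of (W, b) layers, applying ReLU after each."""
    a = x
    for W, b in layers:
        a = relu(W @ a + b)   # affine transform followed by the nonlinearity
    return a

rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(4, 3)), np.zeros(4)),  # 3 inputs -> 4 hidden units
    (rng.normal(size=(2, 4)), np.zeros(2)),  # 4 hidden -> 2 outputs
]
print(forward(np.array([1.0, 0.5, -0.5]), layers))
```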

00:51:13 backpropagation is used to train neural networks by computing gradients and updating weights iteratively
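
As a hedged sketch of that loop, here is backpropagation through a tiny two-layer network; the XOR task, layer sizes, learning rate, and squared-error loss are toy assumptions, not details from the lecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)     # hidden -> output
lr = 0.5

for _ in range(5000):
    # Forward pass: cache activations for use in the backward pass.
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Backward pass: chain rule from the output layer back toward the input.
    d_out = (y_hat - y) * y_hat * (1 - y_hat)     # gradient at output pre-activation (squared error)
    d_hid = (d_out @ W2.T) * h * (1 - h)          # gradient pushed through layer 2

    # Iterative gradient-descent weight updates.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid;  b1 -= lr * d_hid.sum(axis=0)

print(y_hat.round(3))   # predictions should move toward [0, 1, 1, 0]
```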

00:24:51 neural networks are often called black-box algorithms because their internal workings are not easy to interpret

00:28:19 deep learning involves networks with multiple hidden layers for modeling complex data

00:57:16 neural networks are worth looking into because of their potential to model complex data whose structure is not directly obvious

01:00:56 the TensorFlow and Keras libraries simplify the implementation of neural networks
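
For example, a small fully connected network can be defined in a few lines of Keras; the layer sizes, activations, and binary-classification setup here are illustrative assumptions:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),                     # 10 input features
    tf.keras.layers.Dense(64, activation="relu"),    # hidden layer
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output probability
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(X_train, y_train, epochs=10, batch_size=32)  # given training data
```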

01:03:57 we can optimize a neural network by experimenting with its architecture and parameters
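
One hypothetical way to do that experimentation, sketched with placeholder widths and placeholder data variables (X_train, y_train):

```python
import tensorflow as tf

# Sweep over a few hidden-layer widths; the candidate values are arbitrary.
for units in [16, 64, 256]:
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(units, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    # history = model.fit(X_train, y_train, validation_split=0.2, epochs=10)
    # ...then keep the width with the lowest validation loss.
```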

01:11:00 the performance of neural networks should ideally improve upon traditional methods
