@andreaschandra
Created August 26, 2021 07:38
import torch

x = torch.rand(10)  # 10 samples drawn uniformly from [0, 1)
print(x)
> tensor([0.2791, 0.7676, 0.5146, 0.5865, 0.5029, 0.5618, 0.2659, 0.9412, 0.4960,
          0.1228])

# apply softmax: rescales the vector so its elements sum to 1 along dim 0
torch.softmax(x, dim=0)
> tensor([0.0778, 0.1268, 0.0984, 0.1058, 0.0973, 0.1032, 0.0768, 0.1508, 0.0966,
          0.0665])

# apply sigmoid: squashes each element independently into (0, 1)
torch.sigmoid(x)
> tensor([0.5693, 0.6830, 0.6259, 0.6426, 0.6231, 0.6369, 0.5661, 0.7193, 0.6215,
          0.5307])

# apply relu: clamps negative values to zero; x is all non-negative here,
# so the output equals the input
torch.relu(x)
> tensor([0.2791, 0.7676, 0.5146, 0.5865, 0.5029, 0.5618, 0.2659, 0.9412, 0.4960,
          0.1228])
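Because `torch.rand` only produces values in [0, 1), relu acts as the identity above and the three activations look similar. A minimal sketch with a hand-picked tensor containing a negative value (an assumed example, not part of the original gist) makes the differences visible:

```python
import torch

# A tensor with a negative, a zero, and a positive element.
x = torch.tensor([-1.0, 0.0, 2.0])

# relu clamps negatives to zero, leaves non-negatives untouched.
print(torch.relu(x))            # tensor([0., 0., 2.])

# sigmoid maps each element independently into (0, 1).
print(torch.sigmoid(x))

# softmax normalizes across dim 0 so the outputs sum to 1.
s = torch.softmax(x, dim=0)
print(s, s.sum())               # the sum is 1.0
```

Note that softmax couples the elements (each output depends on the whole vector), while sigmoid and relu are applied element-wise.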