This code snippet shows how to swap out layers in a pretrained model. In the following code, we replace every ReLU activation function in a resnet18 model with SELU.
import torch
from torch import nn  # provides nn.SELU for the activation swap
from torchvision import models

# Note: pretrained=True is deprecated in newer torchvision releases;
# weights=models.ResNet18_Weights.DEFAULT is the current equivalent.
resnet18 = models.resnet18(pretrained=True)
# Walks the model's list of child modules, replacing each ReLU with SELU.
def funct(list_mods):