Hey friends! Has anyone written any code before where they freeze particular parameters in a layer of a NN?
I don’t mean freezing particular layers of a NN, which I can do successfully, for example:

for param in resnet18.fc.parameters():
    param.requires_grad = False
What I mean is freezing particular parameters within a layer, e.g. by applying some sort of mask or something similar.
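To make the question concrete, here's a rough, untested sketch of the masking idea I'm picturing in PyTorch (the choice to freeze the first half of the fc weight rows is just a made-up example, and registering a gradient hook on the tensor is only one possible mechanism):

import torch
from torchvision import models

resnet18 = models.resnet18()

# Hypothetical example: freeze the first half of the fc weight's rows, keep the rest trainable.
weight = resnet18.fc.weight           # shape [1000, 512] for resnet18
mask = torch.ones_like(weight)
mask[: weight.shape[0] // 2] = 0.0    # 0 = frozen entry, 1 = trainable entry

# The hook multiplies the incoming gradient by the mask, so frozen entries get zero gradient.
weight.register_hook(lambda grad: grad * mask)

If it matters: I'd guess that optimizers with momentum or weight decay could still nudge the masked entries even when their gradients are zero, so the frozen values might need to be copied back after each optimizer.step() to stay exactly fixed.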
Thanks so much! :)