Today I learnt
·1 min
PyTorch:
`x += 1` and `x = x + 1` are not equivalent in PyTorch! The former is in-place and the latter is not, so be careful. More explanation is available here:

`x = x + 1` is not in-place, because it takes the object pointed to by `x`, creates a new Variable, adds 1 to `x` putting the result in the new Variable, and rebinds `x` to point to the new Variable. There are no in-place modifications; you only change Python references (you can check that `id(x)` is different before and after that line).

On the other hand, doing `x += 1` or `x[0] = 1` will modify the data of the Variable in-place, so no copy is made. However, some functions (in this case `*`) require their inputs to never change after they compute the output, or they wouldn't be able to compute the gradient. That's why an error is raised. `x = x + 1` is probably safer.
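A minimal sketch of both behaviours (a hypothetical snippet, assuming a recent PyTorch where `Variable` has been merged into `Tensor`):

```python
import torch

# `x = x + 1` allocates a new tensor and rebinds the name:
x = torch.zeros(3)
old_id = id(x)
x = x + 1
print(id(x) == old_id)  # False: `x` now points to a new object

# `x += 1` mutates the existing tensor in place:
x = torch.zeros(3)
old_id = id(x)
x += 1
print(id(x) == old_id)  # True: same object, data changed in place

# Why autograd cares: `*` saves its inputs for the backward pass,
# and mutating a saved tensor in place invalidates the gradient.
a = torch.ones(3, requires_grad=True)
b = a + 0   # non-leaf tensor (in-place ops on leaves that require grad are rejected outright)
c = b * b   # multiplication saves `b` for the backward pass
b += 1      # in-place modification of a tensor autograd still needs
try:
    c.sum().backward()
except RuntimeError as err:
    print("backward failed:", err)
```

Note that on a leaf tensor with `requires_grad=True`, even `a += 1` by itself raises immediately, which is why the sketch routes the in-place modification through the non-leaf `b`.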