`x += 1` and `x = x + 1` are not equivalent in PyTorch! The former is in-place and the latter is not, so be careful. More explanation is available here:
`x = x + 1` is not in-place: it takes the object pointed to by `x`, creates a new Variable, adds 1 to `x` putting the result in the new Variable, and rebinds `x` to point to the new Variable. There are no in-place modifications; you only change Python references (you can check that `id(x)` is different before and after that line).
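A quick way to check this yourself (a minimal sketch; in current PyTorch, Variable has been merged into Tensor, so plain tensors are used here):

```python
import torch

x = torch.ones(3)
old_id = id(x)
x = x + 1                 # out-of-place: a new tensor is built, then `x` is rebound
print(id(x) == old_id)    # False: `x` now names a different object

y = torch.ones(3)
old_id = id(y)
y += 1                    # in-place: the same tensor object is mutated
print(id(y) == old_id)    # True: no new object was created
```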
On the other hand, doing `x += 1` or `x.add_(1)` will modify the data of the Variable in-place, so no copy is made. However, some functions (in your case `*`) require their inputs to never change after they compute the output, or they wouldn't be able to compute the gradient. That's why an error is raised.
`x = x + 1` is probably safer.
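Here is a small sketch of how that error shows up. `*` saves its inputs for the backward pass, so mutating one of them in place and then calling `backward()` raises a `RuntimeError` (the variable names are illustrative, not from the original question):

```python
import torch

x = torch.ones(3, requires_grad=True)
w = x * 2            # non-leaf tensor, so in-place ops on it are allowed
y = w * w            # `*` saves w in order to compute dy/dw = 2*w later
w += 1               # in-place edit invalidates the value `*` saved

try:
    y.sum().backward()
except RuntimeError as e:
    # "...has been modified by an inplace operation"
    print("backward failed:", e)
```

Autograd detects the mutation via the tensor's version counter, which every in-place op bumps; the out-of-place `w = w + 1` would have left the saved tensor untouched.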