PyTorch:
`x += 1` and `x = x + 1` are not equivalent in PyTorch! The former is in-place and the latter is not, so be careful. More explanation: `x = x + 1` is not in-place, because it takes the object pointed to by `x`, creates a new Variable, adds 1 to `x` (putting the result in the new Variable), and then rebinds the name `x` to point to the new Variable. No data is modified in place; you only change Python references (you can check that `id(x)` is different before and after that line).

On the other hand, `x += 1` or `x[0] = 1` modifies the data of the Variable in-place, so no copy is made. However, some functions (in this case `*`) require their inputs to never change after they compute the output, or they would not be able to compute the gradient. That is why an error is raised. `x = x + 1` is probably safer.
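A minimal sketch of both points, assuming a recent PyTorch where plain tensors have replaced `Variable`: the `id()` check shows that `x = x + 1` rebinds the name while `x += 1` mutates in place, and the last few lines show the autograd error that in-place modification of a saved tensor triggers.

```python
import torch

# Out-of-place: a new tensor is created and the name is rebound.
x = torch.ones(3)
before = id(x)
x = x + 1
assert id(x) != before  # x now points at a different object

# In-place: the tensor's storage is modified directly, no copy.
y = torch.ones(3)
before = id(y)
y += 1
assert id(y) == before  # same object as before

# Why autograd cares: some ops save their inputs for the backward pass.
a = torch.ones(3, requires_grad=True)
b = a * 2
c = b * b          # d(b*b)/db = 2*b, so autograd saves b here
b += 1             # in-place change to a tensor autograd saved
try:
    c.sum().backward()
except RuntimeError as err:
    print("backward failed:", err)  # "... modified by an inplace operation"
```

Replacing `b += 1` with `b = b + 1` leaves the saved tensor untouched, and `backward()` succeeds.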