Today I learnt

PyTorch:

  • x += 1 and x = x + 1 are not equivalent in PyTorch! The former is in-place and the latter is not, so be careful. More explanation:

    x = x + 1 is not in-place: it takes the object pointed to by x, creates a new Variable, adds 1 and stores the result in that new Variable, then rebinds the name x to point to it. No data is modified in place; only the Python reference changes (you can check that id(x) differs before and after that line).

    On the other hand, x += 1 or x[0] = 1 modifies the data of the Variable in-place, so no copy is made. However, some functions (in this case *) require their inputs to never change after they compute the output, or they wouldn't be able to compute the gradient. That's why an error is raised.

    • x = x + 1 is probably safer.
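A small sketch of all three behaviours: the identity check from the explanation above, and the autograd error you hit when an in-place edit clobbers a tensor that backward still needs (the specific expression b * b is just an illustrative choice, since * saves its inputs for the gradient):

```python
import torch

# Out-of-place: x = x + 1 builds a new tensor and rebinds the name x.
x = torch.zeros(3)
x_id = id(x)
x = x + 1
assert id(x) != x_id  # a different Python object now

# In-place: x += 1 mutates the existing tensor's storage.
y = torch.zeros(3)
y_id = id(y)
y += 1
assert id(y) == y_id  # same object, data changed in place

# Why autograd cares: * saves its inputs for the backward pass.
a = torch.ones(3, requires_grad=True)
b = a * 2
c = b * b        # '*' saves b so it can compute dc/db later
b += 1           # in-place edit invalidates the saved tensor
try:
    c.sum().backward()
    raised = False
except RuntimeError:
    raised = True  # "... has been modified by an inplace operation"
print("backward raised RuntimeError:", raised)
```

The in-place edit itself succeeds; the error only surfaces at backward() time, when autograd notices the saved tensor's version has changed, which is what makes these bugs easy to miss.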