PyTorch AutoGrad:: Automatic Differentiation
● AutoGrad
○ Core package of PyTorch.
○ Provides automatic differentiation for all operations on tensors.
○ Gradients are calculated from the root of the computation graph to the leaves
using the chain rule of differential calculus.
● Two important classes of the AutoGrad package::
a) Variable:: As of PyTorch v0.4.0, the Variable class is deprecated;
Variable and torch.Tensor have been merged, so plain tensors are used instead
(see the sketch below).
b) Function
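A minimal sketch of the deprecation, assuming a post-0.4.0 install: the two tensors
below are equivalent, and torch.autograd.Variable survives only as a thin wrapper
that returns a plain tensor.
import torch
from torch.autograd import Variable

# Pre-0.4.0 style: wrap a tensor in Variable to enable autograd.
x_old = Variable(torch.ones(2, 2), requires_grad=True)

# Modern style: tensors carry requires_grad directly; no wrapper needed.
x_new = torch.ones(2, 2, requires_grad=True)

print(type(x_old), type(x_new))                   # both <class 'torch.Tensor'>
print(x_old.requires_grad, x_new.requires_grad)   # True True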
Important attributes of the Variable class (now available directly on tensors;
illustrated in the sketch below):
1) data:: Used for accessing the raw tensor data.
2) grad:: Stores the value of the gradient computed w.r.t. the tensor, provided
requires_grad is True and the backward() function has been called.
3) grad_fn:: References the Function that created the tensor.
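A minimal sketch of all three attributes on a modern tensor (printed values are
for this specific example):
import torch

x = torch.ones(2, requires_grad=True)  # leaf tensor created by the user
y = (x * 3).sum()                      # tensor created by an operation
y.backward()                           # populates x.grad

print(x.data)     # tensor([1., 1.]) -- raw data, detached from the graph
print(x.grad)     # tensor([3., 3.]) -- d(y)/dx after backward()
print(x.grad_fn)  # None             -- leaf tensors have no creator Function
print(y.grad_fn)  # <SumBackward0 object at ...>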
Example Illustration::
Create a tensor and set requires_grad=True.
import torch
x = torch.ones(2, 2, requires_grad=True)
print(x)
Out::
tensor([[1., 1.],
[1., 1.]], requires_grad=True)
Do a tensor operation::
y = x + 2
print(y)
Out::
tensor([[3., 3.],
[3., 3.]], grad_fn=<AddBackward0>)
Since y is created as the result of an operation, it has a grad_fn.
print(y.grad_fn)
Out::
<AddBackward0 object at 0x7f100c04f2b0>
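By contrast, x was created directly by the user, so it is a leaf of the graph and
its grad_fn is None. Each grad_fn also links back to the functions that produced
its inputs via next_functions, which is how the chain rule is applied step by step
through the graph. A minimal sketch (exact addresses, and possibly the tuple
layout across PyTorch versions, will differ):
print(x.grad_fn)                 # None -- user-created leaf tensor
print(y.grad_fn.next_functions)  # ((<AccumulateGrad object at ...>, 0),)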
More operations on y::
z = y * y * 3
out = z.mean()
print(z, out)
Out::
tensor([[27., 27.],
[27., 27.]], grad_fn=<MulBackward0>) tensor(27., grad_fn=<MeanBackward0>)
Let's compute the gradient by calling backward():
out.backward()
Print the gradients d(out)/dx:
print(x.grad)
Out::
tensor([[4.5000, 4.5000],
[4.5000, 4.5000]])