The torch.Tensor.backward function in PyTorch computes gradients via automatic differentiation. Calling backward on a scalar tensor (typically the loss) propagates gradients backwards through the computation graph and accumulates them in the .grad attribute of every tensor created with requires_grad=True. This process is known as backpropagation and is the core of training deep learning models.
Example 1: In this example, we create a tensor, apply two operations (addition and multiplication) to it, compute the gradients with backward, and print the result.
import torch

# Create a tensor
x = torch.tensor([2.0], requires_grad=True)

# Apply some operations to x
y = x + 3
z = 2 * y

# Compute gradients
z.backward()

# Print gradients
print(x.grad)
Output: tensor([2.])
This example shows how to compute the gradient of a scalar output with respect to the tensor it was computed from. Here z = 2 * (x + 3), so dz/dx = 2, which matches the printed gradient tensor([2.]). Because z is a scalar standing in for the loss, we can call backward on it directly with no arguments.
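If the tensor you call backward on is not a scalar, PyTorch requires an explicit gradient argument specifying the vector for the vector-Jacobian product. The following sketch, with made-up values, shows the vector variant of the example above:

import torch

# backward() on a non-scalar tensor needs an explicit gradient argument
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
z = 2 * (x + 3)  # z is now a vector, not a scalar

# Passing a vector of ones gives the same gradients as calling backward on z.sum()
z.backward(gradient=torch.ones_like(z))
print(x.grad)  # tensor([2., 2., 2.])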
Example 2: In this example, we create a simple neural network with one hidden layer and one output layer. We use the backward function to compute the gradients of the loss with respect to the network's weights and biases.
import torch
import torch.nn as nn
import torch.optim as optim

# Create a neural network
net = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 1),
)

# Define a loss function and optimizer
criterion = nn.MSELoss()
optimizer = optim.SGD(net.parameters(), lr=0.01)

# Forward pass and loss computation
input = torch.randn(10)
output = net(input)
target = torch.randn(1)
loss = criterion(output, target)

# Backward pass and gradient computation
net.zero_grad()
loss.backward()

# Gradient descent
optimizer.step()
This example shows how to use the backward function to compute gradients in a neural network. We define a network with one hidden layer and one output layer, and use mean squared error as the loss function. After the forward pass and loss computation, we call net.zero_grad() to clear any previously accumulated gradients, then loss.backward() to compute the gradients of the loss with respect to every parameter. Finally, the optimizer (here, stochastic gradient descent) uses those gradients to update the weights and biases of the network.
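In practice, these steps are repeated for many iterations. The sketch below shows a typical training loop built from the same pieces; the toy data, batch size, and number of iterations are assumptions made for illustration:

import torch
import torch.nn as nn
import torch.optim as optim

net = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 1))
criterion = nn.MSELoss()
optimizer = optim.SGD(net.parameters(), lr=0.01)

# Toy data: 8 samples with 10 features each (made-up values for illustration)
inputs = torch.randn(8, 10)
targets = torch.randn(8, 1)

for step in range(100):
    optimizer.zero_grad()   # clear gradients accumulated by the previous backward call
    output = net(inputs)
    loss = criterion(output, targets)
    loss.backward()         # compute gradients of the loss w.r.t. all parameters
    optimizer.step()        # update the parameters using those gradients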
This set of examples uses the PyTorch library.