Difference between a Tensor and a Variable in PyTorch
Tensor | Variable
---|---
A tensor is the basic unit of PyTorch. | A variable wraps around a tensor.
A tensor can be multidimensional. | Variables act upon tensors and have two parts: data and gradient.
Tensors can perform operations such as addition and subtraction. | Variables can perform all tensor operations and, in addition, compute gradients.
Tensors are usually constant. | Variables represent changes in data.
Tensors support integer datatypes. | If requires_grad is True, a variable supports only float and complex datatypes.
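The distinction in the table can be sketched with a short example. Note that in modern PyTorch the old `torch.autograd.Variable` has been merged into `torch.Tensor`, so gradient tracking is enabled by passing `requires_grad=True` when creating a tensor:

```python
import torch

# A plain tensor: holds data, no gradient tracking.
t = torch.tensor([1.0, 2.0, 3.0])

# With requires_grad=True the tensor behaves like the old Variable:
# it carries both data and a gradient, and autograd records the
# operations performed on it.
v = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

y = (v * v).sum()   # y = 1 + 4 + 9 = 14
y.backward()        # autograd computes dy/dv = 2 * v

print(v.grad)       # tensor([2., 4., 6.])
```

Trying the same `requires_grad=True` on an integer tensor raises an error, which is the datatype restriction noted in the last row of the table.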
In this article, we are going to see the difference between a Tensor and a Variable in PyTorch.
PyTorch is an open-source machine learning library used for computer vision, natural language processing, and deep neural network processing. It is a Torch-based library, developed by Facebook AI researchers in 2016 and built around the tensor class, that provides a fundamental set of features for numerical computation, optimization, and deployment. The main features of PyTorch are:

- It is similar to NumPy but supports GPU acceleration.
- Automatic differentiation is used to create and train deep learning networks.
- Models can be deployed in mobile applications, making it fast and easy to use.

We should be familiar with some PyTorch modules: nn (used to build neural networks), autograd (automatic differentiation for all operations performed on tensors), optim (optimizes neural network weights to minimize loss), and utils (provides classes for data processing).
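A minimal sketch, assuming a toy linear-regression task, of how the modules mentioned above fit together: `nn` builds the model, autograd computes gradients through `backward()`, and `optim` updates the weights.

```python
import torch
from torch import nn, optim

torch.manual_seed(0)  # for reproducibility

model = nn.Linear(1, 1)                      # nn: one linear layer
opt = optim.SGD(model.parameters(), lr=0.1)  # optim: gradient descent

x = torch.tensor([[1.0], [2.0], [3.0]])
y = 2 * x                                    # target relation: y = 2x

for _ in range(500):
    opt.zero_grad()                      # clear old gradients
    loss = ((model(x) - y) ** 2).mean()  # mean squared error
    loss.backward()                      # autograd: compute gradients
    opt.step()                           # optim: update the weights

print(model.weight.item())  # should end up close to 2.0
```

After training, the learned weight approaches 2, showing the three modules cooperating in a single training loop.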