.. note::
    :class: sphx-glr-download-link-note

    Click :ref:`here <sphx_glr_download_beginner_former_torchies_tensor_tutorial_old.py>` to download the full example code
.. rst-class:: sphx-glr-example-title

.. _sphx_glr_beginner_former_torchies_tensor_tutorial_old.py:


Tensors
=======

Tensors behave almost exactly the same way in PyTorch as they do in
Torch.

Create a tensor of size (5 x 7) with uninitialized memory:



.. code-block:: default


    import torch
    a = torch.empty(5, 7, dtype=torch.float)


Initialize a double tensor with values drawn from a normal distribution with
mean=0, var=1:


.. code-block:: default


    a = torch.randn(5, 7, dtype=torch.double)
    print(a)
    print(a.size())


.. note::
    ``torch.Size`` is in fact a tuple, so it supports the same operations as a tuple.
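
For instance, here is a minimal illustration (reusing the ``a`` defined above)
of ``torch.Size`` behaving like a tuple:


.. code-block:: default


    size = a.size()             # torch.Size([5, 7])
    rows, cols = size           # unpacks like a tuple
    print(size[0], len(size), size == (5, 7))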

Inplace / Out-of-place
----------------------

The first difference is that all operations that modify a tensor in-place
have an ``_`` postfix. For example, ``add`` is the out-of-place version, and
``add_`` is the in-place version.


.. code-block:: default


    a.fill_(3.5)
    # a has now been filled with the value 3.5

    b = a.add(4.0)
    # a is still filled with 3.5
    # new tensor b is returned with values 3.5 + 4.0 = 7.5

    print(a, b)


Some operations like ``narrow`` do not have in-place versions, and
hence, ``.narrow_`` does not exist. Similarly, some operations like
``fill_`` do not have an out-of-place version, so ``.fill`` does not
exist.
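
For illustration, a small sketch of ``narrow`` (using the ``a`` filled above);
it returns a view into the same storage rather than modifying its input:


.. code-block:: default


    # narrow(dim, start, length) returns a view; there is no narrow_()
    c = a.narrow(0, 0, 2)  # the first two rows of a, as a view
    print(c.size())        # torch.Size([2, 7])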

Zero Indexing
-------------

Another difference is that Tensors are zero-indexed. (In Lua, tensors are
one-indexed.)


.. code-block:: default


    b = a[0, 3]  # select 1st row, 4th column from a


Tensors can also be indexed with Python's slicing syntax


.. code-block:: default


    b = a[:, 3:5]  # selects all rows, and the 4th and 5th columns of a


No camel casing
---------------

The next small difference is that function names are no longer camelCase.
For example, ``indexAdd`` is now called ``index_add_``


.. code-block:: default



    # start from a 5 x 5 tensor of ones
    x = torch.ones(5, 5)
    print(x)



.. code-block:: default


    # build a 5 x 2 tensor whose first column is 10 and second column is 100
    z = torch.empty(5, 2)
    z[:, 0] = 10
    z[:, 1] = 100
    print(z)



.. code-block:: default

    # along dim 1, add z[:, 0] to x's column 4 and z[:, 1] to x's column 0
    x.index_add_(1, torch.tensor([4, 0], dtype=torch.long), z)
    print(x)


Numpy Bridge
------------

Converting a torch Tensor to a numpy array and vice versa is a breeze.
The torch Tensor and numpy array will share their underlying memory
locations, and changing one will change the other.

Converting torch Tensor to numpy Array
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


.. code-block:: default


    a = torch.ones(5)
    print(a)



.. code-block:: default


    b = a.numpy()
    print(b)



.. code-block:: default

    a.add_(1)
    print(a)
    print(b) 	# see how the numpy array changed in value



Converting numpy Array to torch Tensor
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


.. code-block:: default


    import numpy as np
    a = np.ones(5)
    b = torch.from_numpy(a)
    np.add(a, 1, out=a)
    print(a)
    print(b)  # see how changing the np array changed the torch Tensor automatically


All Tensors on the CPU, except a CharTensor, support converting to
NumPy and back.
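
As a small sketch of such a round trip (the ``int32`` dtype here is just an
illustrative choice):


.. code-block:: default


    c = torch.ones(3, dtype=torch.int32)
    d = torch.from_numpy(c.numpy())  # back to a torch Tensor
    print(d.dtype)                   # torch.int32 is preserved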

CUDA Tensors
------------

CUDA Tensors are nice and easy in PyTorch, and transferring a tensor from
the CPU to the GPU retains its underlying type.


.. code-block:: default


    # let us run this cell only if CUDA is available
    if torch.cuda.is_available():

        # creates a LongTensor and transfers it
        # to the GPU as a torch.cuda.LongTensor
        a = torch.full((10,), 3, dtype=torch.long, device=torch.device("cuda"))
        print(a.type())

        # transfers it back to the CPU, where it
        # becomes a torch.LongTensor again
        b = a.to(torch.device("cpu"))
        print(b.type())

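Existing tensors can likewise be moved between devices with ``Tensor.to``;
a minimal sketch, guarded on CUDA availability in the same way:


.. code-block:: default


    if torch.cuda.is_available():
        device = torch.device("cuda")
        x = torch.ones(3)
        y = x.to(device)         # copy to the GPU
        z = (y + y).to("cpu")    # compute on the GPU, bring the result back
        print(z)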

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes  0.000 seconds)


.. _sphx_glr_download_beginner_former_torchies_tensor_tutorial_old.py:


.. only :: html

 .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example



  .. container:: sphx-glr-download

     :download:`Download Python source code: tensor_tutorial_old.py <tensor_tutorial_old.py>`



  .. container:: sphx-glr-download

     :download:`Download Jupyter notebook: tensor_tutorial_old.ipynb <tensor_tutorial_old.ipynb>`


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.readthedocs.io>`_