3. Tensors
Tensors are a specialized data structure that is very similar
to arrays and matrices.
In PyTorch, tensors are used to encode the inputs and
outputs of a model, as well as the model’s parameters.
Tensors are similar to NumPy’s ndarrays, except that tensors
can run on GPUs.
Tensors are also optimized for automatic differentiation.
https://pytorch.org/tutorials/beginner/basics/tensorqs_tutorial.html
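A minimal sketch of these three properties — ndarray-like creation, optional GPU placement, and automatic differentiation (assuming PyTorch is installed; the GPU move is guarded so the code also runs on CPU-only machines):

```python
import numpy as np
import torch

# Create a tensor directly from data, or from a NumPy ndarray.
data = [[1, 2], [3, 4]]
x_data = torch.tensor(data)
x_np = torch.from_numpy(np.array(data))  # shares memory with the ndarray

# Tensors can be moved to a GPU if one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
x_data = x_data.to(device)

# requires_grad=True enables automatic differentiation.
w = torch.ones(2, 2, requires_grad=True, device=device)
loss = (x_data.float() * w).sum()
loss.backward()
print(w.grad)  # d(loss)/dw, which here equals x_data
```

Since `loss` is the elementwise product summed, the gradient with respect to `w` is simply `x_data` itself.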
4. Vectors, Matrices, and Tensors

Scalar (rank-0 tensor): $x \in \mathbb{R}$, e.g., $x = 1$

Vector (rank-1 tensor): $\mathbf{x} \in \mathbb{R}^n$, but in this lecture we will assume $\mathbf{x} \in \mathbb{R}^{n \times 1}$, e.g.,
$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$$

Matrix (rank-2 tensor): $\mathbf{X} \in \mathbb{R}^{m \times n}$, e.g.,
$$\mathbf{X} = \begin{bmatrix} x_{1,1} & x_{1,2} & \dots & x_{1,n} \\ x_{2,1} & x_{2,2} & \dots & x_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ x_{m,1} & x_{m,2} & \dots & x_{m,n} \end{bmatrix}$$

$\mathbf{x}^\top = \begin{bmatrix} x_1 & x_2 & \dots & x_n \end{bmatrix}$, where $\mathbf{x}^\top \in \mathbb{R}^{1 \times n}$.
Vectors, Matrices, and Tensors -- Notational Conventions
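These notational conventions map directly onto tensor ranks and shapes in code. A short sketch (assuming PyTorch; `ndim` gives the rank, `.T` transposes a column vector into a row vector):

```python
import torch

x_scalar = torch.tensor(1.0)             # rank-0 tensor (scalar), x in R
x_vec = torch.arange(1.0, 4.0)           # rank-1 tensor, x in R^n with n = 3
x_col = x_vec.reshape(-1, 1)             # column vector, x in R^{n x 1}
X = torch.arange(1.0, 7.0).reshape(2, 3) # rank-2 tensor, X in R^{m x n}

print(x_scalar.ndim, x_vec.ndim, X.ndim)  # 0 1 2
print(x_col.T.shape)                      # transpose: R^{n x 1} -> R^{1 x n}
```

Note that a rank-1 tensor of shape `(3,)` and a column vector of shape `(3, 1)` are distinct objects, which is why the lecture fixes the $\mathbb{R}^{n \times 1}$ convention.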
5. Vectors, Matrices, and Tensors

We will often use $\mathbf{X}$ as a special convention to refer to the
"design matrix," that is, the matrix containing the training
examples and features (inputs). We assume the structure
$\mathbf{X} \in \mathbb{R}^{n \times m}$,
because $n$ is often used to refer to the number of examples in
the literature across many disciplines:

$$\mathbf{X} = \begin{bmatrix} x_1^{[1]} & x_2^{[1]} & \dots & x_m^{[1]} \\ x_1^{[2]} & x_2^{[2]} & \dots & x_m^{[2]} \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{[n]} & x_2^{[n]} & \dots & x_m^{[n]} \end{bmatrix}$$

E.g., $x_2^{[1]}$ = 2nd feature value of the 1st training example.
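In code, the design-matrix convention means row $i$ holds the feature vector of the $(i{+}1)$-th training example, so $x_2^{[1]}$ corresponds to index `[0, 1]` under zero-based indexing. A small sketch (assuming PyTorch; the numbers are made up for illustration):

```python
import torch

# Design matrix with n = 3 training examples and m = 2 features:
# row i holds the feature vector of the (i+1)-th training example.
X = torch.tensor([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])

n, m = X.shape       # n examples, m features
x_feat = X[0, 1]     # 2nd feature value of the 1st training example
print(n, m, x_feat.item())  # 3 2 2.0
```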
7. Example of 3D Tensor
An Example of a 3D Tensor in DL
Image Source: https://code.tutsplus.com/tutorials/create-a-retro-crt-distortion-effect-using-rgb-shifting--active-3359
Single color image (3D tensor for "multidimensional-array" storage and parallel computing purpose,
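A sketch of a single color image stored as a rank-3 tensor (assuming PyTorch; the image size 28×28 is an arbitrary example):

```python
import torch

# A single RGB color image stored as a rank-3 tensor.
# PyTorch convention: (channels, height, width).
img = torch.zeros(3, 28, 28)
print(img.ndim, img.shape)  # 3 torch.Size([3, 28, 28])

# Many image libraries instead use (height, width, channels);
# permute() converts between the two layouts.
img_hwc = img.permute(1, 2, 0)
print(img_hwc.shape)  # torch.Size([28, 28, 3])
```

The three channels (red, green, blue) are what make a single image a 3D rather than a 2D array, matching the RGB-shifting illustration referenced above.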