Step by step process to perform Tensor Broadcasting
1. Different Shapes:
When two tensors have different shapes, they can still be broadcast to perform arithmetic operations, provided their shapes are compatible.
As discussed above, the shapes of two tensors are compatible along a dimension if:
- both sizes along that dimension are equal, or
- one of the sizes is 1
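These compatibility conditions can be sketched as a small helper function (illustrative only; `compatible` is not a TensorFlow API). Shapes are aligned from the trailing dimension, as TensorFlow and NumPy do:

```python
def compatible(shape1, shape2):
    """Check whether two shapes can be broadcast together."""
    # Align shapes from the trailing (rightmost) dimension;
    # any missing leading dimension is treated as size 1.
    for d1, d2 in zip(reversed(shape1), reversed(shape2)):
        # Dimensions are compatible if equal, or if either is 1
        if d1 != d2 and d1 != 1 and d2 != 1:
            return False
    return True

print(compatible((2, 3), (1, 3)))     # True: leading dimension is 1
print(compatible((2, 3), (3,)))       # True: missing dimension acts as 1
print(compatible((2, 3), (2, 3, 2)))  # False: trailing sizes 3 and 2 differ
```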
Rules:
- In the example below, T2 is compatible because its leading dimension has size 1, and both tensors have the same rank.
- T2 is expanded along its leading dimension (its rows) to match T1.
import tensorflow as tf

T1 = tf.constant([[1, 2, 8],    # shape (2, 3)
                  [4, 7, 6]])
T2 = tf.constant([[7, 8, 10]])  # shape (1, 3)

# T2 is compatible, so it is broadcast to match the shape of T1
# Both tensors have rank 2
# Performing the arithmetic operation
T3 = T1 + T2
print(T3)
Output:
tf.Tensor(
[[ 8 10 18]
[11 15 16]], shape=(2, 3), dtype=int32)
T1 and T2 can now be used together in any elementwise operation.
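To see the broadcast form of T2 explicitly, `tf.broadcast_to` materializes the expansion. A brief sketch using the same shapes as above:

```python
import tensorflow as tf

T2 = tf.constant([[7, 8, 10]])          # shape (1, 3)

# Materialize the broadcast: the single row of T2 is repeated along axis 0
expanded = tf.broadcast_to(T2, [2, 3])  # shape (2, 3)
print(expanded)
# [[ 7  8 10]
#  [ 7  8 10]]
```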
2. Different Ranks
When the two tensors have different ranks, one tensor has fewer dimensions than the other.
One of the tensors has Rank 1: the lower-rank tensor is treated as having size 1 along every dimension it is missing (a singleton dimension, i.e. a dimension of size 1).
- Broadcasting in the example below aligns the shapes starting from the trailing dimension of T2.
- The missing leading dimension of T2 is treated as having size 1.
- The sizes along corresponding dimensions of the two tensors are compared, moving from the trailing axis toward the leading axis.
- T2 is then expanded along its leading singleton dimension to match the size of T1.
import tensorflow as tf

# Tensor T1:
T1 = tf.constant([[1, 2, 3],   # rank 2, shape (2, 3)
                  [4, 5, 6]])
# Tensor T2:
T2 = tf.constant([20, 40, 50]) # rank 1, shape (3,)

# T2 is broadcast to shape (2, 3) to match T1
# Arithmetic operation
T3 = T1 + T2
print(T3)
Output:
tf.Tensor(
[[21 42 53]
[24 45 56]], shape=(2, 3), dtype=int32)
The broadcast form of T2 has rank 2 and shape (2, 3).
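This expansion can also be made explicit with `tf.broadcast_to`; a sketch showing the rank-2 form of T2:

```python
import tensorflow as tf

T2 = tf.constant([20, 40, 50])           # rank 1, shape (3,)

# The missing leading dimension is treated as size 1, then expanded to 2
expanded = tf.broadcast_to(T2, [2, 3])   # rank 2, shape (2, 3)
print(expanded)
# [[20 40 50]
#  [20 40 50]]
```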
One of the tensors has Rank 0 (scalar): a scalar is a tensor of rank 0 and can be broadcast to match the shape of the higher-dimensional tensor for elementwise operations.
- When broadcasting a scalar, its empty shape is compatible with every dimension of the other tensor.
- The scalar is expanded along all dimensions of the tensor T1.
- The scalar's value is replicated at every position of the newly broadcast tensor.
import tensorflow as tf

T1 = tf.constant([[1, 2, 3],   # rank 2, shape (2, 3)
                  [4, 5, 8]])
T2 = tf.constant(20)           # rank 0, shape ()

# Arithmetic operation
T3 = T1 + T2
print(T3)
Output:
tf.Tensor(
[[21 22 23]
[24 25 28]], shape=(2, 3), dtype=int32)
The scalar has been broadcast across T1, and any elementwise operation can be performed.
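Scalar broadcasting applies to any elementwise operation, not just addition. A brief sketch multiplying the same T1 by the scalar:

```python
import tensorflow as tf

T1 = tf.constant([[1, 2, 3],
                  [4, 5, 8]])

# The scalar 20 is broadcast to shape (2, 3) for the multiplication
product = T1 * 20
print(product)
# [[ 20  40  60]
#  [ 80 100 160]]
```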
Both tensors have rank greater than 1:
- When we are given two tensors of different ranks, one lower and one higher, say of shapes (2, 3) and (2, 3, 2) respectively.
- The lower-rank tensor is broadcast to match the shape of the higher-rank tensor so that elementwise operations can be performed.
- Broadcasting is performed on the lower-rank tensor, which is expanded along a singleton dimension to match the dimensions of the other tensor.
In the example below, T1 is the higher-rank tensor and T2 the lower-rank one, so broadcasting is performed on T2.
- Shape comparison starts from the trailing dimension of T2's shape.
- The sizes of the corresponding dimensions of the two tensors are compared; here the trailing sizes 3 and 2 differ, so the shapes (2, 3) and (2, 3, 2) are not directly compatible.
- A singleton dimension (a dimension of size 1) is therefore added at the trailing end of T2's shape (2, 3).
- A new axis can be added with `tf.expand_dims(input, axis, name)` in TensorFlow; the resulting shape of T2 becomes (2, 3, 1).
- Checking again, the shapes (2, 3, 1) and (2, 3, 2) are now compatible, since T2 has a 1 in its trailing dimension.
- Broadcasting occurs: T2 is expanded along this singleton dimension to match the shape of T1.
- After broadcasting, T2 has shape (2, 3, 2), matching T1.
- Elementwise operations can now be performed, and the result also has shape (2, 3, 2).
import tensorflow as tf

# Tensor T1
T1 = tf.constant([[[10, 11],   # rank 3, shape (2, 3, 2)
                   [20, 20],
                   [30, 30]],
                  [[40, 41],
                   [50, 50],
                   [60, 60]]])
# Tensor T2
T2 = tf.constant([[1, 2, 3],   # rank 2, shape (2, 3)
                  [4, 5, 6]])

# Add a new axis to T2 with tf.expand_dims() at its trailing end
broadcasted_T2 = tf.expand_dims(T2, axis=-1)  # shape (2, 3, 1)

# Arithmetic operation
T3 = broadcasted_T2 + T1
print(T3)
Output:
tf.Tensor(
[[[11 12]
[22 22]
[33 33]]
[[44 45]
[55 55]
[66 66]]], shape=(2, 3, 2), dtype=int32)
The broadcast tensors now share the shape (2, 3, 2), and any elementwise operation can be performed on them.
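Without the added axis, the shapes (2, 3, 2) and (2, 3) are not compatible (trailing sizes 2 and 3 differ), and TensorFlow refuses to broadcast them. A sketch of the failure and the fix:

```python
import tensorflow as tf

T1 = tf.ones([2, 3, 2], dtype=tf.int32)  # rank 3
T2 = tf.ones([2, 3], dtype=tf.int32)     # rank 2

try:
    # Trailing axes compared right to left: 2 vs 3 mismatch, so this fails
    _ = T1 + T2
except tf.errors.InvalidArgumentError:
    print("Incompatible shapes:", T1.shape, "vs", T2.shape)

# Adding a trailing singleton axis makes the shapes compatible
T3 = T1 + tf.expand_dims(T2, axis=-1)    # (2, 3, 2) + (2, 3, 1)
print(T3.shape)  # (2, 3, 2)
```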
Tensor Broadcasting
Tensor broadcasting is a concept from array-processing libraries such as TensorFlow and NumPy; it allows implicit elementwise operations between arrays of different shapes. In this article, we covered tensor broadcasting, its significance, and the steps to perform it.