Deep Learning Track WiSe 24/25

The power of matrix computation

Matrices can be used to perform various geometrical transformations on vectors or points in space. We will show you some common geometrical transformations that can be achieved using matrices.

What is a transformation?

In mathematics, a transformation of a vector refers to a function or operation that takes a vector as input and produces another vector as output. It can be thought of as a rule or procedure that modifies the original vector in some way.

Transformations of vectors are often represented by matrices. A matrix transformation takes a vector and performs a multiplication operation with a matrix, resulting in a new vector. The matrix acts as a set of coefficients that determine how the original vector is transformed.

To better understand this, let's consider a simple example. Suppose we have a 2-dimensional vector v = [x, y] and we want to transform it using a 2x2 matrix A. The matrix A can be written as:

A = | a11 a12 |
    | a21 a22 |

To transform the vector v using matrix A, we perform a matrix-vector multiplication:

A * v = | a11 a12 |   | x |   | a11 * x + a12 * y |
        | a21 a22 | * | y | = | a21 * x + a22 * y |

The resulting vector Av is the transformed vector. The matrix A determines how the elements of the original vector are combined to form the elements of the transformed vector.

Matrix transformations can have various effects on vectors. They can stretch or shrink the vector, rotate it, reflect it, or skew it. The specific transformation depends on the values of the matrix elements.

It's worth noting that matrix transformations are not limited to 2-dimensional vectors. They can be applied to vectors of any dimensionality using appropriately sized matrices. The concept of vector transformations is widely used in many areas of mathematics, physics, computer graphics, and other fields.
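As a minimal sketch of the matrix-vector multiplication above, here is the computation in NumPy (NumPy is an assumption for illustration; the same operations work on PyTorch tensors with `torch.tensor` and `@`). The matrix values are chosen arbitrarily:

```python
import numpy as np

# A 2x2 transformation matrix with arbitrary coefficients
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

v = np.array([1.0, 1.0])

# Each element of the result is a weighted combination of the
# input elements: [a11*x + a12*y, a21*x + a22*y]
Av = A @ v
print(Av)  # [2. 3.]
```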

We will now show you what the matrices for the most common transformations look like. You will also practice these transformations in a notebook.

Translation

Translation involves shifting an object or a point in space by a certain distance in the x, y, and/or z directions. It can be represented using a translation matrix, where the matrix elements correspond to the amount of translation in each direction. For example, a 2D translation matrix for shifting by (tx, ty) would be:

| 1 0 tx |
| 0 1 ty |
| 0 0 1 |
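Because translation uses a 3x3 matrix, the 2D point is written in homogeneous coordinates (a 1 is appended to (x, y)). A short NumPy sketch with arbitrary shift values:

```python
import numpy as np

tx, ty = 3.0, -1.0  # arbitrary translation amounts
T = np.array([[1.0, 0.0, tx],
              [0.0, 1.0, ty],
              [0.0, 0.0, 1.0]])

# Point (2, 5) in homogeneous coordinates
p = np.array([2.0, 5.0, 1.0])
print(T @ p)  # [5. 4. 1.]  -> the point shifted to (5, 4)
```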

Scaling

Scaling modifies the size of an object or a point. It can be uniformly applied to all dimensions or independently along each axis. Scaling is represented using a scaling matrix, where the diagonal elements represent the scaling factors. For example, a 2D scaling matrix for scaling by (sx, sy) would be:

| sx 0 |
| 0 sy |
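A quick NumPy sketch of non-uniform scaling (the factors are arbitrary): the x component is stretched and the y component shrunk independently.

```python
import numpy as np

sx, sy = 2.0, 0.5  # arbitrary scaling factors per axis
S = np.array([[sx, 0.0],
              [0.0, sy]])

v = np.array([4.0, 4.0])
print(S @ v)  # [8. 2.]
```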

Rotation

Rotation transforms an object or a point around a fixed point or the origin. Rotations can be performed in 2D or 3D space and are typically specified by an angle of rotation or rotation matrix. For example, a 2D rotation matrix for rotating by an angle θ would be:

| cos(θ) -sin(θ) |
| sin(θ) cos(θ) |
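The rotation matrix above can be sketched in NumPy as follows; rotating the unit vector along the x-axis by 90° moves it onto the y-axis (up to floating-point error):

```python
import numpy as np

theta = np.pi / 2  # 90 degrees, in radians
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
print(R @ v)  # approximately [0. 1.]
```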

Shearing

Shearing is a transformation that skews an object or a point in a specified direction. It can be applied along one or more axes. Shearing is represented using a shearing matrix, where the matrix elements control the amount of shearing in each direction. For example, a 2D matrix for shearing along the x-axis by a factor k would be:

| 1 k |
| 0 1 |
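As a minimal NumPy sketch, a horizontal shear with factor k = 1.5 (an arbitrary choice) shifts each point's x coordinate by k times its y coordinate, while y stays unchanged:

```python
import numpy as np

k = 1.5  # arbitrary shear factor along the x-axis
H = np.array([[1.0, k],
              [0.0, 1.0]])

v = np.array([2.0, 2.0])
print(H @ v)  # [5. 2.]  -> x becomes 2 + 1.5*2 = 5, y is unchanged
```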

Reflection

Reflection flips an object or a point across a line or plane. It can be performed along an axis or an arbitrary line. Reflection can be represented using a reflection matrix, which depends on the line or plane of reflection.
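As one concrete case, reflection across the x-axis negates the y component while leaving x unchanged; a NumPy sketch:

```python
import numpy as np

# Reflection across the x-axis
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])

v = np.array([3.0, 4.0])
print(F @ v)  # [ 3. -4.]
```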

Projection

Projection transforms 3D points onto a 2D plane or a lower-dimensional subspace. It is commonly used in computer graphics and computer vision. Projection matrices can be used to project objects or points onto a desired plane or subspace.
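The simplest case is an orthographic projection that drops the z coordinate, mapping a 3D point onto the xy-plane; a NumPy sketch:

```python
import numpy as np

# Orthographic projection onto the xy-plane: keep x and y, drop z
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

p3d = np.array([2.0, 3.0, 7.0])
print(P @ p3d)  # [2. 3.]
```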

Combining transformations

These transformations can be combined by multiplying the corresponding transformation matrices together. By applying these matrices to vectors or points, you can achieve a wide range of geometrical transformations in 2D or 3D space.
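A sketch of composition in NumPy using homogeneous coordinates (a 90° rotation followed by a translation; values are arbitrary). Note that matrices apply right to left, so the order of multiplication matters:

```python
import numpy as np

theta = np.pi / 2  # 90 degrees
# 3x3 rotation in homogeneous coordinates
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
# Translation by (3, 0)
T = np.array([[1.0, 0.0, 3.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

p = np.array([1.0, 0.0, 1.0])  # point (1, 0)
# Rotate first, then translate: T @ (R @ p)
print(T @ R @ p)  # approximately [3. 1. 1.]
```

Swapping the order (R @ T @ p) would translate first and then rotate the shifted point, giving a different result.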
