# Machine Learning Fundamentals - Linear Algebra - Exercise: Matrix and Vector Operations

## Introduction

This exercise tests basic knowledge of linear algebra. Knowledge of matrices, vectors, and their operations is essential for understanding more complex machine learning topics, such as neural networks. Confident handling of the domain-specific notation and concepts is therefore necessary.

## Requirements

### Knowledge

- Chapter 2 of Deep Learning by Ian Goodfellow gives a brief introduction to the field
- Linear Algebra by Jim Hefferon is an open-source textbook with a lot of good exercises
- Introduction to Linear Algebra by Gilbert Strang is a good domain-specific textbook
- Coding the Matrix: Linear Algebra through Applications to Computer Science by Philip Klein focuses on a computer science viewpoint

### Python Modules

```
# External Modules
import numpy as np
```

### Data

Given are the following matrices:

\begin{equation} A = \begin{pmatrix} 4 & 4 & 5 \\ 2 & 1 & 7 \\ 4 & 8 & 3 \end{pmatrix} , B = \begin{pmatrix} 1 & 6 \\ 3 & 1 \\ 5 & 2 \end{pmatrix} , C = \begin{pmatrix} 1 & 4 & 4 \\ 3 & 1 & 2\\ 6 & 7 & 1 \end{pmatrix} \end{equation}

and following vectors:

\begin{equation} \vec{x} = \begin{pmatrix} 9 \\ 5 \\ 7 \end{pmatrix} , \vec{y} = \begin{pmatrix} 3 \\ 1 \\ 5 \end{pmatrix} \end{equation}

```
# Matrices
A = [[4, 4, 5],
     [2, 1, 7],
     [4, 8, 3]]
B = [[1, 6],
     [3, 1],
     [5, 2]]
C = [[1, 4, 4],
     [3, 1, 2],
     [6, 7, 1]]
# Vectors
x = [9, 5, 7]
y = [3, 1, 5]
```
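
If you want to double-check the shapes before starting, you can wrap the lists in NumPy arrays (purely for orientation; the implementations below should use plain lists only):

```
# Shape check via NumPy (orientation only, not part of the exercise)
print(np.array(A).shape)  # (3, 3)
print(np.array(B).shape)  # (3, 2)
print(np.array(x).shape)  # (3,)
```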

## Pen and Paper Calculation

Perform the following calculations by hand, or write some LaTeX in this notebook (a worked example of the first item follows the list).

- $ \vec{x} \cdot \vec{y} $ (dot or inner product)
- $ \vec{x} * \vec{y}^T $ (outer product; the matrix product of a $3 \times 1$ and a $1 \times 3$ matrix)
- $ A * B $
- $ B * A $
- $ A * C $
- $ C * A $
- $ (C^T * A^T)^T $
- $ A \circ C $ (Hadamard or Schur product)
- $ \left \langle A,C \right \rangle_F $ (Frobenius inner product)
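
As a quick self-check, the first item works out as:

\begin{equation} \vec{x} \cdot \vec{y} = 9 \cdot 3 + 5 \cdot 1 + 7 \cdot 5 = 27 + 5 + 35 = 67 \end{equation}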

## Implementation of Basic Operations

Implement the following functions using the Python data structure `list` only. The results should be the same as those of the corresponding NumPy implementation.

### Vector Addition

```
def vector_add(a, b):
    ''' Adds two given vectors a and b:
    https://en.wikipedia.org/wiki/Euclidean_vector

    params:
        a: A list representing vector a
        b: A list representing vector b
    returns:
        [a_1 + b_1, a_2 + b_2, ... , a_n + b_n]
    '''
    raise NotImplementedError

# Test
np.testing.assert_array_almost_equal(vector_add(x,y), np.add(x,y), verbose=True)
```
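
In case you get stuck, one possible list-based sketch looks like the following (the name `vector_add_sketch` and the `zip`-based comprehension are just one way to do it):

```
# One possible sketch: pair up elements with zip and add them
def vector_add_sketch(a, b):
    assert len(a) == len(b), 'vectors must have the same length'
    return [a_i + b_i for a_i, b_i in zip(a, b)]
```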

### Vector Subtraction

```
def vector_sub(a, b):
    ''' Subtracts two given vectors a and b:
    https://en.wikipedia.org/wiki/Euclidean_vector

    params:
        a: A list representing vector a
        b: A list representing vector b
    returns:
        [a_1 - b_1, a_2 - b_2, ... , a_n - b_n]
    '''
    raise NotImplementedError

# Testing
np.testing.assert_array_almost_equal(vector_sub(x,y), np.subtract(x,y), verbose=True)
```
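
Analogously to the addition above, a minimal sketch could subtract element-wise (again, `vector_sub_sketch` is only an illustrative name):

```
# One possible sketch: element-wise subtraction via zip
def vector_sub_sketch(a, b):
    assert len(a) == len(b), 'vectors must have the same length'
    return [a_i - b_i for a_i, b_i in zip(a, b)]
```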

### Scalar Multiplication

```
def scalar_mul(r, A):
    ''' Multiplies each element of a matrix or vector by a scalar 'r':
    https://en.wikipedia.org/wiki/Scalar_multiplication

    params:
        r: Scalar
        A: Vector or matrix
    returns:
        A vector or matrix with the same dimension as 'A', but with each element multiplied by r
    '''
    raise NotImplementedError

# Testing
sca = 3
np.testing.assert_array_almost_equal(scalar_mul(sca,A), np.multiply(sca,A), verbose=True)
```
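
Since `scalar_mul` has to accept both vectors and matrices, one option is to branch on whether the first element is itself a list. A minimal sketch, assuming vectors are flat lists and matrices are lists of rows:

```
# One possible sketch: branch on flat list (vector) vs. nested list (matrix)
def scalar_mul_sketch(r, A):
    if isinstance(A[0], list):  # matrix: scale every element of every row
        return [[r * elem for elem in row] for row in A]
    return [r * elem for elem in A]  # vector: scale every element
```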

### Dot Product

```
def vec_dot(a, b):
    ''' Computes the sum of the products of corresponding elements:
    https://en.wikipedia.org/wiki/Dot_product

    params:
        a: Vector
        b: Vector
    returns:
        a_1 * b_1 + a_2 * b_2 + ... + a_n * b_n
    '''
    raise NotImplementedError

# Testing
np.testing.assert_array_almost_equal(vec_dot(x,y), np.dot(x,y), verbose=True)
```
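
A compact sketch sums the pairwise products (again, just one possible solution):

```
# One possible sketch: sum of pairwise products
def vec_dot_sketch(a, b):
    assert len(a) == len(b), 'vectors must have the same length'
    return sum(a_i * b_i for a_i, b_i in zip(a, b))
```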

### Matrix Multiplication

```
def matrix_mult(A, B):
    ''' Computes the product of two matrices:
    https://en.wikipedia.org/wiki/Matrix_multiplication

    params:
        A: Matrix with dimensions NxP
        B: Matrix with dimensions PxM
    returns:
        NxM matrix with each element c_i_j = a_i_1 * b_1_j + ... + a_i_p * b_p_j
    '''
    raise NotImplementedError

# Testing
np.testing.assert_array_almost_equal(matrix_mult(A,B), np.matmul(A,B), verbose=True)
```
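
Each result element c_i_j is the dot product of row i of A and column j of B, which translates directly into nested loops or comprehensions. One possible sketch:

```
# One possible sketch: c_i_j = dot product of row i of A and column j of B
def matrix_mult_sketch(A, B):
    n, p, m = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(p))
             for j in range(m)]
            for i in range(n)]
```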

### Transpose

```
def matrix_transpose(A):
    ''' Flips the matrix over its diagonal, i.e., switches the row and column indices of the matrix:
    https://en.wikipedia.org/wiki/Transpose

    params:
        A: Matrix
    returns:
        The transpose A^T of the given matrix A
    '''
    raise NotImplementedError

# Testing
np.testing.assert_array_almost_equal(matrix_transpose(A), np.transpose(A), verbose=True)
```
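
For reference, `zip(*A)` regroups the i-th entries of all rows, i.e., the columns of A, which allows a very short sketch:

```
# One possible sketch: zip(*A) yields the columns of A as tuples
def matrix_transpose_sketch(A):
    return [list(col) for col in zip(*A)]
```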

## Summary and Outlook

This exercise covered basic operations on vectors and matrices. If it felt too difficult, revisit the sources listed in the Requirements section for a recap.

## Licenses

### Notebook License (CC-BY-SA 4.0)

*The following license applies to the complete notebook, including code cells. It does however not apply to any referenced external media (e.g., images).*

Exercise: Matrix and Vector Operations

by Benjamin Voigt

is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Based on a work at https://gitlab.com/deep.TEACHING.

### Code License (MIT)

*The following license only applies to code cells of the notebook.*

Copyright 2018 Benjamin Voigt

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.