Machine Learning 2 – Exercise Solutions

Author: Taha El Amine Kassabi
Course: Machine Learning 2 (WS 2023/24)
Instructor: Prof. Dr. Dominik Heider
University: Heinrich Heine University DΓΌsseldorf (HHU)


📚 Overview

This repository contains solutions to all 11 exercises from the Machine Learning 2 course at HHU. Each assignment covers foundational or advanced topics in classical machine learning, from matrix manipulation and statistics to support vector machines and neural networks. Most solutions consist of Jupyter notebooks, supplementary Markdown or PDF answers, and linked datasets.


📂 Repository Structure

Sheets/
├── Sheet 0 – Conda, YAML, file IO
├── Sheet 1 – Text processing & matrices
├── Sheet 2 – Descriptive statistics, correlation
├── Sheet 3 – PCA & MDS (PCoA)
├── Sheet 4 – Clustering (K-Means, Hierarchical)
├── Sheet 5 – Linear and logistic regression
├── Sheet 6 – k-Nearest Neighbors
├── Sheet 7 – Support Vector Machines
├── Sheet 8 – Decision Trees
├── Sheet 9 – Random Forest & Performance Measures
└── Sheet 10 – Neural Nets, Backprop, CNN filters

Each folder includes a task sheet (exercise_N.pdf) and data or helper files. Solutions are under Solutions/Exercise N.


🧠 Topics Covered

| Exercise | Topics |
|----------|--------|
| 0 | Conda setup, YAML, reading/writing files |
| 1 | Text parsing, matrix algebra |
| 2 | Descriptive stats, skewness, correlation |
| 3 | PCA & classical MDS |
| 4 | K-means, silhouettes, hierarchical clustering |
| 5 | Linear regression, logistic regression |
| 6 | k-Nearest Neighbors (k-NN), normalization |
| 7 | SVM, Lagrange multipliers, separating hyperplanes |
| 8 | Decision trees, information gain, Gini |
| 9 | Random forests, Gini importance, AUC, precision/recall |
| 10 | McCulloch/Pitts networks, backpropagation, CNN filters, MNIST training (bonus) |
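Most of these topics are exercised through scikit-learn. Purely as an illustration (not taken from the solutions), here is a minimal pipeline combining several of the topics above, normalization, PCA, and an SVM, on scikit-learn's built-in digits dataset:

```python
# Illustrative sketch only, not from the repository's solutions:
# chains normalization, PCA, and an SVM with scikit-learn.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# StandardScaler normalizes features, PCA keeps 30 components,
# SVC fits a support vector machine (default RBF kernel).
clf = make_pipeline(StandardScaler(), PCA(n_components=30), SVC())
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.3f}")
```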

💾 Setup

pip install -r requirements.txt

You may also use JupyterLab or Google Colab for individual notebooks.


🚀 Usage

Navigate to any exercise in Solutions/Exercise N, open the notebook(s), and run the cells. Some tasks also include .md, .pdf, or .docx files with written answers to the non-coding questions.


πŸ“ Notes

  • Most exercises were completed using Python with NumPy, Pandas, Matplotlib, Scikit-learn, and **SciPy **.
  • For neural networks (Exercise 10 bonus), models were implemented using Keras/TensorFlow and PyTorch.
