The goal of deep.TEACHING is to improve the qualification of students at HTW Berlin - University of Applied Sciences, partnering companies, as well as external users and students in the machine learning domain. The project aims to provide suitable teaching materials for bachelor's and master's programs that impart a theoretical foundation and practical experience.
We are developing three real-world scenarios, Medical Image Classification, Robotic / Autonomous Driving and Text Information Extraction, to focus on relevant educational materials. All code samples and exercises are written in Python, the most commonly used programming language in this domain. Code and documentation are embedded in Jupyter notebooks to provide an interactive and explorative environment. Jupyter notebooks store source code and markdown-formatted documentation in executable cells. This approach allows us to create a narrative by splitting up complex algorithms into small, digestible pieces. We recommend JupyterLab, a web-based data science IDE, to work with Jupyter notebooks.
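Under the hood, a notebook file (`.ipynb`) is just a JSON document whose cells record their type and source. As an illustrative sketch (the minimal dictionary below is hand-written to show the layout, not produced by Jupyter itself), a notebook with one markdown cell and one code cell looks roughly like:

```python
import json

# Minimal illustration of the .ipynb JSON layout: a notebook is a JSON
# document whose "cells" list mixes markdown and code cells.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {},
    "cells": [
        {
            "cell_type": "markdown",
            "metadata": {},
            "source": ["# Gradient descent\n", "We minimize f step by step."],
        },
        {
            "cell_type": "code",
            "metadata": {},
            "execution_count": None,
            "outputs": [],
            "source": ["theta = theta - learning_rate * gradient"],
        },
    ],
}

# Serializing this dictionary yields a file that JupyterLab can open.
notebook_json = json.dumps(notebook, indent=2)
print(notebook_json[:40])
```

Interleaving markdown cells (the narrative) with small code cells (the steps of an algorithm) is what makes the format well suited for teaching.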
The project website provides statically rendered HTML sites to preview the educational materials. In order to interact with notebooks and to run code, clone the educational-materials Git repository to your own computer (see How to use deep.TEACHING Notebooks for detailed instructions).
If you have questions, please open a GitLab issue.
The following three pages provide a compact introduction to each of the scenarios. Get started with these introductions and choose a topic you are interested in.
Some of the provided notebooks contain exercises without any solutions. If you are a teacher, you can request access to the solutions in a private repository.
All notebooks will go through a two-tier review process. The first stage is called Minor Review and is usually carried out by a research assistant in the deep.TEACHING team. The second stage is a Major Review, carried out by a more experienced machine learning expert or professor. The review state of each notebook can be inspected in the extensive Notebook List, and a detailed description of the two-tier process can be found in the Review Guide.
How to use deep.TEACHING Notebooks
This tutorial explains how to use the Jupyter notebook teaching materials on your own computer. We recommend using Python 3.6 on a Linux distribution like Fedora 27.
First install pip, Git and graphviz on your computer.
On Fedora run:
sudo dnf install python3-pip git graphviz
On Ubuntu run:
sudo apt update
sudo apt install python3-pip git graphviz
If the software repositories of your preferred Linux distribution do not provide Python 3.6, consider using pyenv. We are using pipenv for dependency resolution, which will make use of pyenv automatically, if installed properly.
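As a rough sketch of how this interplay works (assuming pyenv is already installed and on your PATH; the version number is just an example), installing a pyenv-managed Python that pipenv can discover might look like:

```shell
# Install a Python 3.6 interpreter managed by pyenv
pyenv install 3.6.8

# When creating the virtual environment, pipenv asks pyenv for a
# matching interpreter if the system Python does not satisfy the Pipfile
pipenv --python 3.6 install
```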
If you are on Windows follow the Windows Setup instructions.
After installing the system packages, use the following commands to get up and running.
# Install pipenv and jupyterlab in user's home directory
pip3 install --user pipenv jupyterlab

# Clone repository via git
git clone https://gitlab.com/deep.TEACHING/educational-materials.git
cd educational-materials

# Create a new virtual environment and install dependencies from Pipfile.lock
pipenv install

# Create an ipython kernel for the virtual environment
pipenv run ipython kernel install --user --name deep_teaching_kernel

# Run Jupyter Lab to navigate through materials
jupyter lab

# When opening a notebook with Jupyter Lab, select deep_teaching_kernel (upper right corner)
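To confirm that an open notebook is actually running on the kernel you selected, you can inspect the interpreter path from a code cell (a quick sanity check, not part of the official setup):

```python
import sys

# The interpreter path reveals which environment the kernel runs in;
# for the pipenv-created kernel it should point into a virtualenv
# directory rather than the system Python.
print(sys.executable)
```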
This setup (pipenv, optionally together with pyenv) ensures a reproducible setup across platforms and protects us from breaking changes in our Python dependencies.
We would like to build a community of contributors around this project, to collect and improve teaching materials in the machine learning domain. Pull requests are accepted via https://gitlab.com/deep.TEACHING. Please take a look at the Developer Guide to ensure a consistent quality of the materials.
All materials collected in this repository are meant to be easily accessible and can be used, shared and edited by everyone.
Notebooks: Each Jupyter notebook contains an individual license header to honor the authors of the corresponding notebook. All notebooks are distributed under a CC-BY-SA 4.0 license. This applies to an entire notebook, including code cells, but excluding any external media (e.g. images).
Code: Code cells included in the notebooks are dual-licensed as CC-BY-SA 4.0 (as stated in the Notebooks section) and the MIT license. The MIT license provides an easy way to reuse code in other software projects, where a Creative Commons license would not be suitable.
Media: The media directory contains subdirectories named after each content creator. A LICENSE.md is included in each of these subdirectories, to individually honor content creators by name.