# Python Machine Learning Notebooks
Essential code for jump-starting machine learning/data science with Python

## Essential tutorial-type notebooks on Pandas and NumPy
* Jupyter notebooks covering a wide range of functions and operations on the topics of NumPy, Pandas, Seaborn, matplotlib, etc. (a short illustrative sketch follows)
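As a flavour of what these notebooks cover, here is a minimal, self-contained sketch (hypothetical data and column names, not code from the repository) of typical NumPy/Pandas operations:

```python
# Minimal sketch of common NumPy/Pandas operations (hypothetical data).
import numpy as np
import pandas as pd

# Build a small DataFrame from a NumPy random array
rng = np.random.default_rng(42)
df = pd.DataFrame(rng.normal(size=(100, 3)), columns=["a", "b", "c"])

# Vectorized column arithmetic and summary statistics
df["d"] = df["a"] * 2 + df["b"] ** 2
print(df.describe())

# Grouping and aggregation on a derived categorical column
df["sign"] = np.where(df["a"] > 0, "positive", "negative")
print(df.groupby("sign")["d"].mean())
```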
## Tutorial-type notebooks covering regression, classification, clustering, and some basic neural network algorithms

### Regression
* Simple linear regression with t-statistic generation
* Multiple ways to do linear regression in Python and their speed comparison ([check the article I wrote on freeCodeCamp](https://medium.freecodecamp.org/data-science-with-python-8-ways-to-do-linear-regression-and-measure-their-speed-b5577d75f8b))
* Multi-variate regression with regularization
* Polynomial regression showing how to use the ***scikit-learn pipeline feature*** ([check the article I wrote on *Towards Data Science*](https://towardsdatascience.com/machine-learning-with-python-easy-and-robust-method-to-fit-nonlinear-data-19e8a1ddbd49)); see the sketch after this list
* Decision trees and Random Forest regression (showing how Random Forest works as a robust/regularized meta-estimator that rejects overfitting)
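As a quick illustration of the pipeline idea mentioned above, here is a minimal sketch on synthetic data (not taken from the notebooks) that chains `PolynomialFeatures` with `LinearRegression`:

```python
# Minimal sketch: polynomial regression via a scikit-learn Pipeline (synthetic data).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Synthetic nonlinear data: y = 2*x^3 - x + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 1))
y = 2 * X[:, 0] ** 3 - X[:, 0] + rng.normal(scale=0.5, size=200)

# Chain the feature transformer and the linear model into a single estimator
model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
model.fit(X, y)
print("Training R^2:", model.score(X, y))
```

The pipeline keeps the feature-generation step and the model together, so a single `fit`/`predict` call handles both.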
### Classification
* Logistic regression/classification
* _k_-nearest neighbor classification
* Decision trees and Random Forest classification
* Support vector machine classification
* Naive Bayes classification (a sketch comparing these classifiers follows)
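The sketch below (using scikit-learn's built-in breast-cancer toy dataset, purely illustrative and not the notebooks' code) fits several of the classifiers listed above with default settings and compares their held-out accuracy:

```python
# Minimal sketch comparing several scikit-learn classifiers on a toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

classifiers = {
    "Logistic regression": LogisticRegression(max_iter=5000),
    "k-nearest neighbors": KNeighborsClassifier(),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Support vector machine": SVC(),
    "Naive Bayes": GaussianNB(),
}

# Fit each classifier and report its accuracy on the held-out test set
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.3f}")
```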
### Clustering
* _K_-means clustering (see the short sketch below)
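A minimal sketch of K-means on synthetic blobs (illustrative only, not the notebook's code):

```python
# Minimal sketch: K-means clustering on synthetic Gaussian blobs.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans

# Generate three well-separated clusters
X, _ = make_blobs(n_samples=300, centers=3, cluster_std=1.0, random_state=0)

# Fit K-means with k=3 and inspect the result
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)
print("Cluster centers:\n", kmeans.cluster_centers_)
print("Inertia (within-cluster sum of squares):", kmeans.inertia_)
```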
### Function approximation by linear model and Deep Learning method
* Demo notebook to illustrate the superiority of deep neural networks for complex nonlinear function approximation tasks.
* Step-by-step building of 1-hidden-layer and 2-hidden-layer dense networks using basic TensorFlow methods (a minimal sketch follows)
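The notebooks build these networks step by step; as a compact stand-in, here is a sketch of a 1-hidden-layer dense network written with the `tf.keras` API (an assumption made here for brevity, not necessarily the API used in the notebooks):

```python
# Minimal sketch: a 1-hidden-layer dense network approximating y = sin(x).
import numpy as np
import tensorflow as tf

# Synthetic nonlinear target with a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(1000, 1)).astype("float32")
y = np.sin(x) + rng.normal(scale=0.1, size=(1000, 1)).astype("float32")

# One hidden layer of 50 ReLU units, linear output for regression
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=50, batch_size=32, verbose=0)
print("Final training MSE:", model.evaluate(x, y, verbose=0))
```

Adding a second `Dense` hidden layer turns this into the 2-hidden-layer variant.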
## Basic interactive controls demo
* Demo on how to integrate basic interactive controls (slider bars, drop-down menus, check-boxes, etc.) in a Jupyter notebook and use them for interactive machine learning tasks (see the sketch below)
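A minimal sketch of the idea, assuming the `ipywidgets` package inside a running Jupyter notebook (a hypothetical example, not the demo notebook itself): a slider re-fits a _k_-NN classifier whenever its value changes.

```python
# Minimal sketch: an ipywidgets slider driving a model hyperparameter.
from ipywidgets import interact
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Moving the slider calls the function again with the new value of k
@interact(k=(1, 15))
def knn_accuracy(k=5):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5)
    print(f"k = {k}: mean 5-fold CV accuracy = {scores.mean():.3f}")
```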
## Run Jupyter using Docker