Commit 3e7ef89

Merge branch 'master' of git@github.com:lisa-lab/DeepLearningTutorials

2 parents 79c90fa + c3ac123

2 files changed: 18 additions, 13 deletions

doc/logreg.txt (9 additions, 9 deletions)
@@ -7,20 +7,19 @@ Classifying MNIST digits using Logistic Regression
 ==================================================
 
 .. note::
-    This sections assumes the reader is familiar with the following Theano
+    This section assumes familiarity with the following Theano
     concepts: `shared variables`_ , `basic arithmetic ops`_ , `T.grad`_ .
 
-TODO: shared variables documentation not up !!
-
-TODO: put shortcuts to the downloads right here (the download for the full source)
-
+.. note::
+    The code for this section is available for download `here`_.
 
-.. _shared variables: http://www.pylearn.org/theano/basic_tutorial
+.. _here: http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/logistic_sgd.py
 
-.. _basic arithmetic ops: http://www.pylearn.org/theano/basic_tutorial/adding.html
+.. _shared variables: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#using-shared-variables
 
-.. _T.grad: http://www.pylearn.org/theano/basic_tutorial/examples.html#computing-gradients
+.. _basic arithmetic ops: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/adding.html#adding-two-scalars
 
+.. _T.grad: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#computing-gradients
 
 In this section, we show how Theano can be used to implement the most basic
 classifier: the logistic regression. We start off with a quick primer of the
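The hunk above edits the opening of a tutorial chapter whose actual implementation is the linked logistic_sgd.py, written in Theano. As a rough, framework-free illustration of the model the chapter discusses (not the tutorial's code; all names here are invented for the example), multiclass logistic regression trained by plain gradient descent can be sketched in NumPy:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_logreg(X, y, n_classes, lr=0.1, epochs=1000):
    """Multiclass logistic regression fit by batch gradient descent.

    X has shape (n_samples, n_features); y holds integer class labels.
    """
    n, d = X.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]          # one-hot targets
    for _ in range(epochs):
        P = softmax(X @ W + b)        # predicted class probabilities
        G = (P - Y) / n               # gradient of mean NLL w.r.t. logits
        W -= lr * X.T @ G
        b -= lr * G.sum(axis=0)
    return W, b

def predict(W, b, X):
    # Argmax over class scores; softmax is monotone, so logits suffice.
    return np.argmax(X @ W + b, axis=1)
```

The tutorial's Theano version expresses the same loss symbolically and gets the gradient from `T.grad`; the update rule is otherwise the same.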
@@ -362,7 +361,8 @@ error of 7.61%.
 .. rubric:: Footnotes
 
 .. [#f1] For smaller datasets and simpler models, more sophisticated descent
-         algorithms can be more effective. The sample code logistic_cg.py
+         algorithms can be more effective. The sample code
+         `logistic_cg.py <http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/logistic_cg.py>`_
          demonstrates how to use SciPy's conjugate gradient solver with Theano
          on the logistic regression task.
 

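The footnote above points at logistic_cg.py for the conjugate-gradient variant. A minimal standalone analogue (a sketch under stated assumptions, not the tutorial's file) can be written against SciPy's `fmin_cg`, here for the binary case:

```python
import numpy as np
from scipy.optimize import fmin_cg

def fit_logreg_cg(X, y):
    """Binary logistic regression via SciPy's conjugate gradient solver.

    X: (n, d) inputs; y: labels in {0, 1}. Returns theta = [w..., bias]
    for the model p(y=1 | x) = sigmoid(w . x + bias).
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])      # absorb the bias term

    def nll(theta):
        z = Xb @ theta
        # Mean negative log-likelihood, log(1 + exp(z)) - y * z,
        # written stably with logaddexp.
        return np.mean(np.logaddexp(0.0, z) - y * z)

    def grad(theta):
        p = 1.0 / (1.0 + np.exp(-(Xb @ theta)))   # sigmoid probabilities
        return Xb.T @ (p - y) / n

    return fmin_cg(nll, np.zeros(d + 1), fprime=grad, disp=False)
```

On linearly separable data the unregularized loss has no finite minimizer, so the solver simply stops once the gradient is small; adding an L2 penalty would make the problem strictly convex. The tutorial's file does the analogous thing with the loss and gradient compiled by Theano.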
doc/mlp.txt (9 additions, 4 deletions)
@@ -11,13 +11,18 @@ Multilayer Perceptron
 `T.tanh`_, `shared variables`_, `basic arithmetic ops`_, `T.grad`_,
 :ref:`L1_L2_regularization`.
 
-.. _T.tanh: http://www.pylearn.org/theano/basic_tutorial/examples.html?highlight=tanh#logistic-function
+.. note::
+    The code for this section is available for download `here`_.
+
+.. _here: http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/mlp.py
+
+.. _T.tanh: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html?highlight=tanh
 
-.. _shared variables: http://www.pylearn.org/theano/basic_tutorial
+.. _shared variables: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#using-shared-variables
 
-.. _basic arithmetic ops: http://www.pylearn.org/theano/basic_tutorial/adding.html
+.. _basic arithmetic ops: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/adding.html#adding-two-scalars
 
-.. _T.grad: http://www.pylearn.org/theano/basic_tutorial/examples.html#computing-gradients
+.. _T.grad: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#computing-gradients
 
 
 The next architecture we are going to present using Theano is the single-hidden

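The diff above rewrites link targets in the introduction to the multilayer-perceptron chapter; the chapter itself (full code in the linked mlp.py) stacks a tanh hidden layer under a logistic-regression output layer. As a rough NumPy sketch of that architecture's forward pass (hypothetical helper names, not the tutorial's Theano code):

```python
import numpy as np

def init_mlp(n_in, n_hidden, n_out, seed=0):
    # Small random hidden weights break symmetry; zero biases are fine.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
    b2 = np.zeros(n_out)
    return W1, b1, W2, b2

def mlp_forward(params, X):
    """Forward pass: tanh hidden layer, then softmax output layer."""
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)             # hidden activations in (-1, 1)
    z = h @ W2 + b2                      # output-layer logits
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```

In the tutorial this whole expression is built symbolically so that `T.grad` can differentiate the loss through both layers at once.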