@@ -7,20 +7,19 @@ Classifying MNIST digits using Logistic Regression
 ==================================================
 
 .. note::
-    This section assumes the reader is familiar with the following Theano
+    This section assumes familiarity with the following Theano
     concepts: `shared variables`_ , `basic arithmetic ops`_ , `T.grad`_ .
 
-  TODO: shared variables documentation not up !!
-
-  TODO: put shortcuts to the downloads right here (the download for the full source)
-
+.. note::
+    The code for this section is available for download `here`_.
 
-.. _shared variables: http://www.pylearn.org/theano/basic_tutorial
+.. _here: http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/logistic_sgd.py
 
-.. _basic arithmetic ops: http://www.pylearn.org/theano/basic_tutorial/adding.html
+.. _shared variables: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#using-shared-variables
 
-.. _T.grad: http://www.pylearn.org/theano/basic_tutorial/examples.html#computing-gradients
+.. _basic arithmetic ops: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/adding.html#adding-two-scalars
 
+.. _T.grad: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#computing-gradients
 
 In this section, we show how Theano can be used to implement the most basic
 classifier: the logistic regression. We start off with a quick primer of the
@@ -362,7 +361,8 @@ error of 7.61%.
 .. rubric:: Footnotes
 
 .. [#f1] For smaller datasets and simpler models, more sophisticated descent
-         algorithms can be more effective. The sample code logistic_cg.py
+         algorithms can be more effective. The sample code
+         `logistic_cg.py <http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/logistic_cg.py>`_
          demonstrates how to use SciPy's conjugate gradient solver with Theano
          on the logistic regression task.
 
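The section this diff edits trains logistic regression with Theano and stochastic gradient descent. As a minimal, framework-free sketch of the same training loop, the following uses plain NumPy on a hypothetical two-blob toy dataset (the variable names and data are illustrative assumptions, not the tutorial's MNIST code):

```python
import numpy as np

# Hypothetical toy data: two Gaussian blobs, labeled 0 and 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
               rng.normal(+2.0, 1.0, size=(50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters initialized to zero, as in the tutorial's Theano version.
w = np.zeros(2)
b = 0.0
lr = 0.1  # learning rate

for _ in range(200):
    p = sigmoid(X @ w + b)            # predicted P(y = 1 | x)
    grad_w = X.T @ (p - y) / len(y)   # gradient of the mean negative log-likelihood w.r.t. w
    grad_b = np.mean(p - y)           # gradient w.r.t. the bias
    w -= lr * grad_w                  # gradient descent step
    b -= lr * grad_b

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

The gradients here are written out by hand; the point of the tutorial's Theano version is that `T.grad` derives them automatically from the cost expression.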