
Commit efe2eaf

Edits suggested by @lamblin.
1 parent: 26d3402

File tree

1 file changed: +2 -2 lines


doc/mlp.txt

Lines changed: 2 additions & 2 deletions
@@ -55,7 +55,7 @@ follows:
 .. figure:: images/mlp.png
     :align: center

-Formally, a one-hidden-layer MLP computes a function :math:`f: R^D \rightarrow
+Formally, a one-hidden-layer MLP is a function :math:`f: R^D \rightarrow
 R^L`, where :math:`D` is the size of input vector :math:`x` and :math:`L` is
 the size of the output vector :math:`f(x)`, such that, in matrix notation:

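For reference (beyond the two-line wording change in this commit), the forward pass the sentence above introduces can be sketched in plain NumPy. The tanh hidden-layer activation and softmax output non-linearity below are assumptions for illustration, not details stated in the excerpt::

    import numpy as np

    def mlp_forward(x, W1, b1, W2, b2):
        # Hidden layer: h(x) = s(b1 + W1 x), with s assumed to be tanh.
        h = np.tanh(b1 + W1 @ x)
        # Output layer: f(x) = G(b2 + W2 h), with G assumed to be softmax.
        z = b2 + W2 @ h
        e = np.exp(z - z.max())       # numerically stable softmax
        return e / e.sum()

    # Example shapes: D = 4 inputs, 3 hidden units, L = 2 outputs.
    rng = np.random.default_rng(0)
    x = rng.normal(size=4)
    W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)
    W2, b2 = rng.normal(size=(2, 3)), np.zeros(2)
    print(mlp_forward(x, W1, b1, W2, b2))  # a length-L probability vector
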
@@ -132,7 +132,7 @@ to use something else.

 If you look into theory this class implements the graph that computes
 the hidden layer value :math:`h(x) = \Phi(x) = s(b^{(1)} + W^{(1)} x)`.
-If you give this value as input to the ``LogisticRegression`` class,
+If you give this graph as input to the ``LogisticRegression`` class,
 implemented in the previous tutorial :doc:`logreg`, you get the output
 of the MLP. You can see this in the following short implementation of
 the ``MLP`` class.
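
As a companion to the composition described in the hunk above, here is a hypothetical NumPy outline of feeding a hidden-layer graph into a logistic-regression output layer. The class names echo the tutorial's ``HiddenLayer``, ``LogisticRegression`` and ``MLP``, but this is a sketch under those assumptions, not the tutorial's Theano implementation::

    import numpy as np

    class HiddenLayer:
        # Computes h(x) = s(b + W x); s is assumed to be tanh here.
        def __init__(self, W, b):
            self.W, self.b = W, b

        def output(self, x):
            return np.tanh(self.b + self.W @ x)

    class LogisticRegression:
        # Softmax output layer, standing in for the class from the logreg tutorial.
        def __init__(self, W, b):
            self.W, self.b = W, b

        def output(self, h):
            z = self.b + self.W @ h
            e = np.exp(z - z.max())
            return e / e.sum()

    class MLP:
        # Feeds the hidden layer's output into the logistic-regression layer.
        def __init__(self, hidden, logreg):
            self.hidden, self.logreg = hidden, logreg

        def output(self, x):
            return self.logreg.output(self.hidden.output(x))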
