Changes from 1 commit (of 33 commits in this pull request)
53713c9  Fix for issue #6352 (tracer0tong, Feb 17, 2016)
65a2b8f  Fixed codestyle (tracer0tong, Feb 18, 2016)
eb242c2  ENH: FeatureHasher now accepts string values. (devashishd12, Jan 15, 2016)
876f123  Do not ignore files starting with _ in nose (lesteve, Feb 29, 2016)
2e7d9ad  FIX: improve docs of randomized lasso (Mar 7, 2016)
bf81451  Fix consistency in docs and docstring (Mar 7, 2016)
a754e09  Added ref to Bach and improved docs (Mar 7, 2016)
f81e5aa  Try to fix link to pdf (Mar 7, 2016)
3a83071  fix x and y order (Mar 7, 2016)
4eca0c9  updated info for cross_val_score (ohld, Mar 14, 2016)
c2eaf75  Merge pull request #6173 from dsquareindia/featurehasher_fix (MechCoder, Mar 19, 2016)
b64e992  Merge pull request #6542 from ohld/make_scorer-link (glouppe, Mar 19, 2016)
e2e6bde  Merge pull request #6498 from clamus/rand-lasso-fix-6493 (glouppe, Mar 19, 2016)
9691824  Merge pull request #6466 from lesteve/nose-ignore-files-tweak (glouppe, Mar 19, 2016)
7580746  Fix broken link in ABOUT (bryandeng, Mar 20, 2016)
bd6b313  Merge pull request #6565 from bryandeng/doc-link (agramfort, Mar 20, 2016)
549474d  [gardening] Fix NameError ("estimator" not defined). Remove unused va… (practicalswift, Mar 20, 2016)
e228581  Merge pull request #6566 from practicalswift/fix-nameerror-and-remove… (jnothman, Mar 21, 2016)
e9492b7  LabelBinarizer single label case now works for sparse and dense case (devashishd12, Jan 24, 2016)
528533d  MAINT: Simplify n_features_to_select in RFECV (MechCoder, Mar 21, 2016)
945cb7e  Merge pull request #6221 from dsquareindia/LabelBinarizer_fix (MechCoder, Mar 21, 2016)
54af09e  Fixing typos in logistic regression docs (hlin117, Mar 21, 2016)
146f461  Merge pull request #6575 from hlin117/logregdocs (agramfort, Mar 22, 2016)
07a6433  Fix typo in html target (Mar 22, 2016)
b3c2219  Add the possibility to add prior to Gaussian Naive Bayes (Jan 18, 2016)
5a046c7  Update whatsnew (MechCoder, Mar 22, 2016)
5d92bd5  Merge pull request #6579 from nlathia/issue-6541 (TomDLT, Mar 23, 2016)
65b570b  Update scorer.py (lizsz, Mar 19, 2016)
56d625f  Merge pull request #6569 from MechCoder/minor (TomDLT, Mar 23, 2016)
22d7cd5  Make dump_svmlight_file support sparse y (yenchenlin, Feb 18, 2016)
eed5fc5  Merge pull request #6395 from yenchenlin1994/make-dump_svmlight_file-… (TomDLT, Mar 23, 2016)
afc058f  Merge pull request #6376 from tracer0tong/issue_6352 (TomDLT, Mar 24, 2016)
612cd9e  ENH: Support data centering in LogisticRegression (kernc, Mar 17, 2016)
Fixing typos in logistic regression docs
hlin117 committed Mar 21, 2016
commit 54af09e6f13a0f54d987deaa77f689e498eef015
34 changes: 18 additions & 16 deletions sklearn/linear_model/logistic.py
@@ -499,7 +499,7 @@ def logistic_regression_path(X, y, pos_class=None, Cs=10, fit_intercept=True,

     The "balanced" mode uses the values of y to automatically adjust
     weights inversely proportional to class frequencies in the input data
-    as ``n_samples / (n_classes * np.bincount(y))``
+    as ``n_samples / (n_classes * np.bincount(y))``.

     Note that these weights will be multiplied with sample_weight (passed
     through the fit method) if sample_weight is specified.
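The formula in the docstring above can be evaluated directly with NumPy. The following is an illustrative sketch with toy data (the array values are assumptions, not part of the PR):

```python
import numpy as np

# Sketch of the "balanced" class-weight formula from the docstring:
#     n_samples / (n_classes * np.bincount(y))
# Minority classes receive proportionally larger weights.
y = np.array([0, 0, 0, 0, 1, 1])  # 4 samples of class 0, 2 of class 1
n_samples = y.shape[0]
n_classes = np.unique(y).shape[0]
weights = n_samples / (n_classes * np.bincount(y))
print(weights)  # class 0 -> 0.75, class 1 -> 1.5
```

The minority class (1) gets twice the weight of the majority class (0), matching the inverse-frequency intent described in the docstring.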
@@ -514,12 +514,13 @@ def logistic_regression_path(X, y, pos_class=None, Cs=10, fit_intercept=True,
     'sag' and 'lbfgs' solvers support only l2 penalties.

     intercept_scaling : float, default 1.
-        This parameter is useful only when the solver 'liblinear' is used
+        Useful only when the solver 'liblinear' is used
         and self.fit_intercept is set to True. In this case, x becomes
         [x, self.intercept_scaling],
-        i.e. a "synthetic" feature with constant value equals to
+        i.e. a "synthetic" feature with constant value equal to
         intercept_scaling is appended to the instance vector.
-        The intercept becomes intercept_scaling * synthetic feature weight
+        The intercept becomes ``intercept_scaling * synthetic_feature_weight``.
+
         Note! the synthetic feature weight is subject to l1/l2 regularization
         as all other features.
         To lessen the effect of regularization on synthetic feature weight
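The behaviour documented in this hunk can be seen in a small sketch: with the 'liblinear' solver the intercept is learned as the weight of a synthetic constant feature and is therefore regularized, and a larger intercept_scaling lessens that effect. The data and parameter values below are illustrative assumptions only:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy 1-D, linearly separable data (assumed for illustration).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])

for scaling in (1.0, 100.0):
    clf = LogisticRegression(solver='liblinear', C=0.1,
                             intercept_scaling=scaling).fit(X, y)
    # clf.intercept_ already includes the intercept_scaling factor,
    # i.e. intercept_scaling * synthetic_feature_weight.
    print(scaling, clf.intercept_, clf.coef_)
```

With strong regularization (small C), the fitted intercepts for the two scalings typically differ, since the synthetic feature's weight is penalized like any other coefficient.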
@@ -841,7 +842,7 @@ def _log_reg_scoring_path(X, y, train, test, pos_class=None, Cs=10,
     n_samples > n_features.

     intercept_scaling : float, default 1.
-        This parameter is useful only when the solver 'liblinear' is used
+        Useful only when the solver 'liblinear' is used
         and self.fit_intercept is set to True. In this case, x becomes
         [x, self.intercept_scaling],
         i.e. a "synthetic" feature with constant value equals to
@@ -989,13 +990,14 @@ class LogisticRegression(BaseEstimator, LinearClassifierMixin,
     Specifies if a constant (a.k.a. bias or intercept) should be
     added to the decision function.

-    intercept_scaling : float, default: 1
-        Useful only if solver is liblinear.
-        when self.fit_intercept is True, instance vector x becomes
+    intercept_scaling : float, default 1.
+        Useful only when the solver 'liblinear' is used
+        and self.fit_intercept is set to True. In this case, x becomes
         [x, self.intercept_scaling],
-        i.e. a "synthetic" feature with constant value equals to
+        i.e. a "synthetic" feature with constant value equal to
         intercept_scaling is appended to the instance vector.
-        The intercept becomes intercept_scaling * synthetic feature weight
+        The intercept becomes ``intercept_scaling * synthetic_feature_weight``.
+
         Note! the synthetic feature weight is subject to l1/l2 regularization
         as all other features.
         To lessen the effect of regularization on synthetic feature weight
@@ -1007,7 +1009,7 @@ class LogisticRegression(BaseEstimator, LinearClassifierMixin,

     The "balanced" mode uses the values of y to automatically adjust
     weights inversely proportional to class frequencies in the input data
-    as ``n_samples / (n_classes * np.bincount(y))``
+    as ``n_samples / (n_classes * np.bincount(y))``.

     Note that these weights will be multiplied with sample_weight (passed
     through the fit method) if sample_weight is specified.
@@ -1346,7 +1348,7 @@ class LogisticRegressionCV(LogisticRegression, BaseEstimator,

     The "balanced" mode uses the values of y to automatically adjust
     weights inversely proportional to class frequencies in the input data
-    as ``n_samples / (n_classes * np.bincount(y))``
+    as ``n_samples / (n_classes * np.bincount(y))``.

     Note that these weights will be multiplied with sample_weight (passed
     through the fit method) if sample_weight is specified.
@@ -1425,13 +1427,13 @@ class LogisticRegressionCV(LogisticRegression, BaseEstimator,
     Stochastic Average Gradient descent solver for 'multinomial' case.

     intercept_scaling : float, default 1.
-        Useful only if solver is liblinear.
-        This parameter is useful only when the solver 'liblinear' is used
+        Useful only when the solver 'liblinear' is used
         and self.fit_intercept is set to True. In this case, x becomes
         [x, self.intercept_scaling],
-        i.e. a "synthetic" feature with constant value equals to
+        i.e. a "synthetic" feature with constant value equal to
         intercept_scaling is appended to the instance vector.
-        The intercept becomes intercept_scaling * synthetic feature weight
+        The intercept becomes ``intercept_scaling * synthetic_feature_weight``.
+
         Note! the synthetic feature weight is subject to l1/l2 regularization
         as all other features.
         To lessen the effect of regularization on synthetic feature weight