[MRG] Added missing sag_solver documentation by patrick1011 · Pull Request #10172 · scikit-learn/scikit-learn

Merged 2 commits into scikit-learn:master on Nov 23, 2017

Conversation

@patrick1011 (Contributor) commented Nov 19, 2017

Reference Issues/PRs

Fixes #10150: beta parameter not documented in sag_solver.

What does this implement/fix? Explain your changes.

Updated docs for alpha and beta parameters.

Other comments

@arthurmensch was mentioned in the original issue, if you'd like to take a look!

@patrick1011 changed the title from "added missing SAGA documentation" to "added missing sag_solver documentation" on Nov 19, 2017
@patrick1011 changed the title from "added missing sag_solver documentation" to "Added missing sag_solver documentation" on Nov 19, 2017
@patrick1011 changed the title from "Added missing sag_solver documentation" to "[MRG] Added missing sag_solver documentation" on Nov 20, 2017
@glemaitre (Member) left a comment:

nitpicks. LGTM

Reviewed docstring snippet:

    Defaults to 1.

    beta : float, optional
        Constant that multiplies the l1 regularization term. Only applied
Member: Can you replace the double spaces with a single space after each full stop.

Member: l1 -> L1 (to be consistent with your previous description)

@patrick1011 (Contributor Author): Done.

Member: but not pushed?

@amueller (Member): does that mean we have an implementation of #8288 but no interface to it?!

@amueller (Member): rephrased in terms of docs: can alpha and beta be != 0 at the same time in the current implementation, or are they mutually exclusive?
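For what it's worth, the two constants define an elastic-net-style penalty when combined, 0.5 * alpha * ||W||_F^2 + beta * ||W||_1, following the docstring formulas suggested elsewhere in this thread. A minimal pure-Python sketch with a hypothetical penalty() helper (illustrative only, not scikit-learn API):

```python
# Illustrative only: a hypothetical penalty() combining the two constants
# discussed in this thread, mirroring the proposed docstring formulas
#   0.5 * alpha * ||W||_F^2  (squared L2 term)  +  beta * ||W||_1  (L1 term).
def penalty(W, alpha, beta):
    """Elastic-net-style penalty on a weight matrix W (list of rows)."""
    sq_frobenius = sum(w * w for row in W for w in row)
    l1 = sum(abs(w) for row in W for w in row)
    return 0.5 * alpha * sq_frobenius + beta * l1

# Both constants can be nonzero at the same time:
W = [[1.0, -2.0], [0.0, 3.0]]
print(penalty(W, alpha=0.1, beta=0.01))  # ~0.76
```

Setting beta=0 recovers a pure (squared) L2 penalty, and alpha=0 a pure L1 penalty, so the two parameters are independent knobs rather than mutually exclusive modes.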

@jnothman (Member) left a comment:

LGTM

@jnothman (Member): Sorry, I missed @amueller's comments :/

@amueller (Member): I mean, that doesn't mean it's not correct, I just wasn't sure. I'm not familiar with the implementation (and it wasn't entirely obvious to me).

@@ -131,7 +131,12 @@ def sag_solver(X, y, sample_weight=None, loss='log', alpha=1., beta=0.,
         *loss='multinomial'*

     alpha : float, optional
-        Constant that multiplies the regularization term. Defaults to 1.
+        Constant that multiplies the squared L2 regularization term.
arthurmensch (Contributor): L2 regularization term in the objective function (.5 * alpha * || W ||_F^2)

Reviewed docstring snippet:

    Defaults to 1.

    beta : float, optional
        Constant that multiplies the L1 regularization term. Only applied
arthurmensch (Contributor): L2 regularization term in the objective function (beta * || W ||_1)

Member: I guess @arthurmensch meant L1 here and not L2.

@patrick1011 (Contributor Author): Yeah, I think so.
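As background for the "Only applied" (in SAGA) clause in the beta snippet above: the L1 term is nonsmooth, so SAGA handles it with a proximal (soft-thresholding) step that plain SAG does not perform. A minimal sketch of that operator with a hypothetical prox_l1() helper (illustrative only, not the actual Cython implementation):

```python
# Illustrative only: scalar soft-thresholding, the proximal operator of
# the L1 term threshold * |w| that SAGA applies after each gradient step.
def prox_l1(w, threshold):
    """Shrink w toward zero by threshold; zero it out if |w| <= threshold."""
    if w > threshold:
        return w - threshold
    if w < -threshold:
        return w + threshold
    return 0.0  # small weights are set exactly to zero (sparsity)

# Large weights shrink toward zero; small ones vanish entirely:
print([prox_l1(w, 0.25) for w in (0.75, -0.75, 0.1)])  # [0.5, -0.5, 0.0]
```

This exact zeroing is what makes the L1 term (and hence beta) meaningful only in the SAGA variant of the solver.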

@arthurmensch (Contributor) commented Nov 21, 2017

@amueller yes, basically we already have an elastic net regularization for LogisticRegression.
I can add it if that has been user-requested.

@amueller (Member): oh wow! Why don't we expose that lol?

@lesteve (Member) commented Nov 22, 2017

@patrick1011 can you tackle @arthurmensch's comments, and then I think we can merge this one.

@patrick1011 (Contributor Author): @lesteve Ah sorry, didn't realise I was supposed to update anything. Would you prefer me to use the comments he made as the documentation? If so, I can do that!

@lesteve (Member) commented Nov 22, 2017

> @lesteve Ah sorry, didn't realise I was supposed to update anything. Would you prefer me to use the comments he made as the documentation? If so, I can do that!

That'd be great indeed if you could use @arthurmensch's comments for the docstrings.

@patrick1011 (Contributor Author): Hope that's okay now!

@lesteve (Member) commented Nov 23, 2017

I pushed a small tweak. Merging, thanks a lot @patrick1011!

@lesteve lesteve merged commit 9808810 into scikit-learn:master Nov 23, 2017
jwjohnson314 pushed a commit to jwjohnson314/scikit-learn that referenced this pull request Dec 18, 2017
Successfully merging this pull request may close these issues:

beta parameter not documented in sag_solver

6 participants