TSNE - n_iter_without_progress not working · Issue #6450 · scikit-learn/scikit-learn · GitHub

TSNE - n_iter_without_progress not working #6450


Closed
Pfaeff opened this issue Feb 25, 2016 · 13 comments

@Pfaeff
Pfaeff commented Feb 25, 2016

The _gradient_descent function always uses the default value of 50 for this, no matter what you set it to be. This is bad, especially if you want t-SNE to run for a fixed number of iterations.

It can be fixed by changing the _tsne function as follows:

    opt_args = {"n_iter": 50, "momentum": 0.5, "it": 0,
                "learning_rate": self.learning_rate,
                "n_iter_without_progress": self.n_iter_without_progress, # FIXME: this was missing
                "verbose": self.verbose, "n_iter_check": 25,
                "kwargs": dict(skip_num_points=skip_num_points)}

Greetings

@ssaeger
ssaeger commented Feb 25, 2016

I think you are right, and line 816 seems to be an error too; it can be removed once the previous line is fixed.
Should I create a PR for this?

@Pfaeff
Author
Pfaeff commented Feb 25, 2016

If line 816 is wrong, then line 815 should probably be changed as well. It seems min_grad_norm is not working either?

@ssaeger
ssaeger commented Feb 25, 2016

I'm not completely sure whether these two lines are intentional, but it seems strange to me that self.n_iter_without_progress and self.min_grad_norm are completely unused and are instead replaced by hardcoded values without explanation (at least one of them is the same as the parameter's default). Additionally, it is not documented that these parameters are fixed if you select the Barnes-Hut method.
In general, though, it should be possible to set these parameters even if they are overwritten for the Barnes-Hut method. So I think we have to fix it as you suggested, but for lines 815 and 816 it would be better to hear from someone more familiar with the code.

@AlexanderFabisch
Member

It was a design decision to not expose these parameters to the user. They are set to similar values in most other open source implementations of t-SNE.

@ssaeger
ssaeger commented Mar 4, 2016

OK, I understand.
Should we then add this to the documentation? The values used might be of interest to some users, and it is not very transparent that a user can set some parameters that are then overwritten without this being mentioned anywhere.

@AlexanderFabisch
Member

I don't think that makes sense. The parameters are not mentioned in the user documentation because they cannot be used. Everyone who wants to modify the code can read the code and docstring of the function.

@ssaeger
ssaeger commented Mar 5, 2016

Good point, but I still see the problem that a user can set n_iter_without_progress as follows:
model = TSNE(n_iter_without_progress=100)
but it is never used. If method='exact', it is overwritten with 50 when _gradient_descent(...) is called, and for method='barnes_hut' it is overwritten with 30.

If this is intended, I do not see why we have this parameter, but maybe I'm missing something.
Otherwise, I think this can be fixed as suggested by @Pfaeff so that for method='exact' the user-defined value is used.
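
To illustrate the behavior described above (a sketch only; the hardcoded values 50 and 30 come from this thread and apply to an affected version, not necessarily to the one you have installed):

    # Illustrative only: the constructor accepts the parameter, but an affected
    # version ignores it and uses the hardcoded value instead (50 for
    # method='exact', 30 for method='barnes_hut').
    import numpy as np
    from sklearn.manifold import TSNE

    X = np.random.RandomState(0).rand(100, 10)
    tsne = TSNE(n_iter_without_progress=100, method='exact', verbose=2)
    tsne.fit_transform(X)   # verbose output still reflects the hardcoded stopping rule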

@AlexanderFabisch
Member

No, it should not be possible:

In [1]: from sklearn.manifold import TSNE

In [2]: TSNE(n_iter_without_progress=100)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-2-95de97ba6c57> in <module>()
----> 1 TSNE(n_iter_without_progress=100)

TypeError: __init__() got an unexpected keyword argument 'n_iter_without_progress'

@ssaeger
ssaeger commented Mar 6, 2016

That's strange, because I do not get this error with the latest code in master.

@AlexanderFabisch
Member

OK, I had an old version installed. :) The bug was introduced either with #5186 (to fix #5165) or with #4887. It would be great if you could make a pull request. We should remove this here too. I hope there are no more inconsistencies...

@ssaeger
ssaeger commented Mar 6, 2016

ok, I will create a PR.

@sandeepayyar
sandeepayyar commented Oct 27, 2016

Hi,

I am still experiencing the same error when I run the TSNE function:

TypeError: _gradient_descent() got an unexpected keyword argument 'n_iter_check'

Please advise if there is any way to fix this.

@amueller
Member
amueller commented Nov 1, 2016

@sandeepayyar in the development version or the stable version? This fix is not released yet.
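
One generic way to check which version is installed, and whether its TSNE constructor already exposes the parameter (a sketch, not part of the fix discussed here; inspect.signature requires Python 3):

    # Print the installed scikit-learn version and check whether the TSNE
    # constructor accepts n_iter_without_progress.
    import inspect
    import sklearn
    from sklearn.manifold import TSNE

    print(sklearn.__version__)
    print('n_iter_without_progress' in inspect.signature(TSNE.__init__).parameters)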
