Exporting a single node DecisionTreeClassifier to dot graph raises error when filled is True · Issue #6580 · scikit-learn/scikit-learn

Closed
raghavrv opened this issue Mar 22, 2016 · 4 comments
Labels
Easy Well-defined and straightforward way to resolve

Comments

@raghavrv
Member

Code to reproduce:

from sklearn.tree import DecisionTreeClassifier
from sklearn.tree import export_graphviz
import numpy as np

dtc = DecisionTreeClassifier().fit(np.random.random_sample((20, 2)), np.zeros((20, 1)))
export_graphviz(dtc, filled=True)

This is because the maximum and minimum impurity are the same for a single-node tree (colors['bounds'][0] == colors['bounds'][1]), so interpolating a node's color between the two bounds breaks down.

We should set the color directly to a single value in this case.
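A minimal sketch of the degenerate case and one possible guard (the function name and structure below are illustrative, not scikit-learn's actual internals): when both bounds are equal, a normalization of the form (impurity - lo) / (hi - lo) divides by zero, so a fixed alpha can be returned instead.

```python
def impurity_alpha(impurity, bounds):
    """Map a node's impurity onto [0, 1] for color blending."""
    lo, hi = bounds
    if hi == lo:
        # Degenerate single-node tree: every impurity equals the bounds,
        # so return a fixed alpha instead of dividing by (hi - lo) == 0.
        return 0.0
    return (impurity - lo) / (hi - lo)

# Normal tree: impurities spread across the bounds.
print(impurity_alpha(0.5, (0.0, 1.0)))  # 0.5

# Single-node tree: bounds collapse to one value; the guard avoids the error.
print(impurity_alpha(0.3, (0.3, 0.3)))  # 0.0
```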

@agramfort added the Easy (Well-defined and straightforward way to resolve) and Need Contributor labels on Mar 22, 2016
@andrewshir

Can I take this one?

@agramfort
Member
agramfort commented Mar 22, 2016 via email

@raghavrv
Member Author

I am sorry. I just now realized that this issue is a duplicate of #6352 for which there is a PR #6376 ;(

@agramfort do we close this one? (There is a PR for this issue at #6582)

@TomDLT
Member
TomDLT commented Mar 24, 2016

closing as #6376 is merged

@TomDLT TomDLT closed this as completed Mar 24, 2016