[MRG + 1] add partial_fit to multioutput module by yupbank · Pull Request #8054 · scikit-learn/scikit-learn

[MRG + 1] add partial_fit to multioutput module #8054


Merged: 21 commits merged into scikit-learn:master from the add-partial-fit-to-multioutout branch on Jan 5, 2017

Conversation

@yupbank (Contributor) commented Dec 14, 2016

Reference Issue

Fixes: #8053

What does this implement/fix? Explain your changes.

Adds a partial_fit method to the multioutput meta-estimators (MultiOutputClassifier and MultiOutputRegressor).

Any other comments?

@amueller (Member)

Tests are failing. Also, please use if_delegate_has_method.

@yupbank (Contributor Author) commented Dec 14, 2016

Sorry about the typo. And hmm, can you explain more about using if_delegate_has_method?

@amueller (Member)

Do a git grep to see how it's used. Basically we want hasattr(est, "partial_fit") to fail if partial_fit is not implemented. So you want partial_fit in the MultiOutputEstimator to raise an AttributeError if the underlying estimator doesn't support partial_fit.
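For context, a minimal sketch of the pattern being described, assuming the if_delegate_has_method decorator from sklearn.utils.metaestimators of this era (the class body is illustrative, not the PR's exact code):

from sklearn.base import BaseEstimator
from sklearn.utils.metaestimators import if_delegate_has_method

class MultiOutputEstimator(BaseEstimator):
    def __init__(self, estimator, n_jobs=1):
        self.estimator = estimator
        self.n_jobs = n_jobs

    @if_delegate_has_method('estimator')
    def partial_fit(self, X, y, classes=None, sample_weight=None):
        # With the decorator, accessing self.partial_fit raises
        # AttributeError when self.estimator has no partial_fit, so
        # hasattr(est, 'partial_fit') correctly returns False.
        pass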

@yupbank (Contributor Author) commented Dec 14, 2016

I see your point. More work is needed: the current implementation doesn't account for reusing self.estimators_, and a simple clone handed to the Parallel job is why the test cases are failing.

@yupbank force-pushed the add-partial-fit-to-multioutout branch from 9ac08c2 to 7b42573 on December 14, 2016
@yupbank (Contributor Author) commented Dec 17, 2016

@amueller could I get another review?

class MultiOutputEstimator(six.with_metaclass(ABCMeta, BaseEstimator)):

    def __init__(self, estimator, n_jobs=1):
        self.estimator = estimator
        self.n_jobs = n_jobs

    @if_delegate_has_method('estimator')
Contributor

Hi, while you are at it, can you please add @if_delegate_has_method for the fit and predict functions too? Thanks.

Contributor Author

Sure thing

Member

No, we do not need if_delegate_has_method for fit. It is required for every estimator. I think in this context we are dealing with predictors and we do not need if_delegate_has_method for predict.

We also do not need the scope of the PR to be expanded unnecessarily. Throwing in a decorator means that decoration needs testing for instance.

Sorry for the unnecessary work, @yupbank

@@ -83,8 +164,10 @@ def fit(self, X, y, sample_weight=None):
raise ValueError("Underlying regressor does not support"
Contributor

A small nitpick: could you replace "regressor" with "estimator", since it is the base class? Thanks again!

Contributor Author

Thanks and will do

Contributor

Thanks, this was my mistake :)

else:
    estimator = copy.copy(estimator)

if sample_weight is not None:
Contributor

I was wondering if

estimator.partial_fit(X, y, classes=classes, sample_weight=sample_weight)

would suffice, since the partial_fit function of the base estimator would handle it appropriately. Let me know if that sounds okay!

Contributor Author

Since the regressor and classifier share the same interface, the choice of argument for classes is not negligible. For sample_weight I might need to check whether it is universal across the SGD estimators.

Contributor Author

The fit in BaseSGD only takes X and y, so I think I will keep this to make sure all the SGD estimators can use this code.

Contributor

Hi, thanks for looking into it. I understand that the classes argument is not applicable for regressors. But in case someone inadvertently passes the classes argument with a regressor, the estimator would throw an error, right? Perhaps we can have separate partial_fit functions in the subclasses MultiOutputRegressor and MultiOutputClassifier? Just my 2c.

Also, BaseSGD has an abstract definition for fit, which is implemented in the subclasses BaseSGDClassifier and BaseSGDRegressor, right?

Please do let me know what you think. Thanks.

Contributor

Hi, sorry, I missed the partial_fit function you added to MultiOutputRegressor. And since it is a helper function, I get the reason for separate code paths based on classes. Sorry for the noise.

Member

classes is required for classifiers' partial_fit, but I think we have no requirement that sample_weight be supported.
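A sketch of how the per-output call can dispatch these arguments, close to the helper this PR introduces (the name _partial_fit_estimator and the exact branching should be treated as approximate, not the merged code verbatim):

from sklearn.base import clone

def _partial_fit_estimator(estimator, X, y, classes=None,
                           sample_weight=None, first_time=True):
    # Clone only on the first call so fitted state accumulates afterwards.
    if first_time:
        estimator = clone(estimator)
    # Forward classes/sample_weight only when given, since not every
    # estimator's partial_fit accepts them.
    if classes is not None and sample_weight is not None:
        estimator.partial_fit(X, y, classes=classes,
                              sample_weight=sample_weight)
    elif classes is not None:
        estimator.partial_fit(X, y, classes=classes)
    elif sample_weight is not None:
        estimator.partial_fit(X, y, sample_weight=sample_weight)
    else:
        estimator.partial_fit(X, y)
    return estimator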

# train the multi_target_linear and also get the predictions.
half_index = int(X.shape[0] / 2)
multi_target_linear.partial_fit(
    X[:half_index], y[:half_index], classes=classes)
Member

Should also check:

  • sample_weight
  • that passing classes=None raises an appropriate error (see the sketch below)
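A hedged sketch of the second check; the data here is illustrative, and the expected message matches the test added later in this thread:

import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.multioutput import MultiOutputClassifier
from sklearn.utils.testing import assert_raises_regex

X = np.array([[1., 2.], [2., 1.], [3., 3.], [4., 2.]])
y = np.array([[0, 1], [1, 0], [0, 1], [1, 0]])

clf = MultiOutputClassifier(SGDClassifier(loss='log', random_state=1))
# The first call to partial_fit must receive `classes`.
assert_raises_regex(
    ValueError, "classes must be passed on the first call to partial_fit.",
    clf.partial_fit, X, y)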

half_index = 25
for n in range(3):
    sgr = SGDRegressor(random_state=0)
    sgr.partial_fit(X_train[:half_index], y_train[:half_index, n])
Member

You also need to test sample_weight; a sketch follows.
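A sketch of such a check, assuming SGDRegressor's partial_fit accepts sample_weight (it does); the data is illustrative:

import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.RandomState(0)
X = rng.rand(50, 3)
y = rng.rand(50, 2)
w = rng.rand(50)

mor = MultiOutputRegressor(SGDRegressor(random_state=0))
mor.partial_fit(X, y, sample_weight=w)

# Each per-output estimator should match an SGDRegressor trained on the
# corresponding column with the same weights.
for i in range(y.shape[1]):
    sgr = SGDRegressor(random_state=0)
    sgr.partial_fit(X, y[:, i], sample_weight=w)
    np.testing.assert_array_almost_equal(mor.estimators_[i].coef_, sgr.coef_)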

    Data.

y : (sparse) array-like, shape (n_samples, n_outputs)
    Multi-output targets. An indicator matrix turns on multilabel
Member

Multilabel is not appropriate here.

def __init__(self, estimator, n_jobs=1):
    super(MultiOutputRegressor, self).__init__(estimator, n_jobs)

def partial_fit(self, X, y, sample_weight=None):
    """ Fit linear model with Stochastic Gradient Descent..
Member

Huh?

Contributor Author

Will update.

@@ -68,7 +148,8 @@ def fit(self, X, y, sample_weight=None):
         """

         if not hasattr(self.estimator, "fit"):
-            raise ValueError("The base estimator should implement a fit method")
+            raise ValueError(
Member

These cosmetic fixes make it much harder to review your work.

classes : array, shape (n_classes, n_outputs)
    Classes across all calls to partial_fit.
    Can be obtained by via
    `[np.unique(y[:, i]) for i in range(y.shape[1])]`, where y is the
Member

This is only going to work if there are the same number of classes in each output.

@yupbank (Contributor Author) commented Dec 19, 2016

The code in the documentation is going to work even with a different number of classes in each output. But yeah, I need to change https://github.com/scikit-learn/scikit-learn/pull/8054/files#diff-66150694b846268fe58229e40db8b909R122 into `classes[i] if classes is not None else None`.

import numpy as np

y = np.array([[0, 1, 2, 0], [0, 1, 0, 1], [1, 0, 1, 0]]).T

In [11]: y
Out[11]:
array([[0, 0, 1],
       [1, 1, 0],
       [2, 0, 1],
       [0, 1, 0]])

[np.unique(y[:, i]) for i in range(y.shape[1])]
# [array([0, 1, 2]), array([0, 1]), array([0, 1])]

    Multi-output targets. An indicator matrix turns on multilabel
    estimation.

classes : array, shape (n_classes, n_outputs)
Member

Should specify "of int or string", and perhaps it should not be an array, to allow for different numbers of classes in each output.

    Data.

y : (sparse) array-like, shape (n_samples, n_outputs)
    Multi-output targets. An indicator matrix turns on multilabel
Member

It doesn't really turn on anything. Multilabel is binary multioutput by definition.
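For example (purely illustrative): a multilabel target is simply a multioutput target whose entries happen to be binary indicators.

import numpy as np
# 2 samples, 3 labels; each column is one binary output.
y_multilabel = np.array([[1, 0, 1],
                         [0, 1, 1]])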

class MultiOutputEstimator(six.with_metaclass(ABCMeta, BaseEstimator)):

    def __init__(self, estimator, n_jobs=1):
        self.estimator = estimator
        self.n_jobs = n_jobs

    @if_delegate_has_method('estimator')
    def partial_fit(self, X, y, classes=None, sample_weight=None):
        """ Fit linear model with Stochastic Gradient Descent..
Member

Huh?

if first_time:
    estimator = clone(estimator)
else:
    estimator = copy.copy(estimator)
Member

This is an interesting implementation detail. I acknowledge that copying like this is the only easy way to maintain consistent behaviour whether or not parallelism is used. However I think it should be a deepcopy if anything. And I suppose you should ensure this behaviour in tests: that across multiple calls to partial_fit the same estimator objects are not maintained.

Contributor Author

I don't think deepcopy is required; it would be a waste of memory. joblib.Parallel is going to take care of it and ensure that no job interferes with another job's input.

from sklearn.externals.joblib import Parallel, delayed

d = [0]
a = [d, d, d]  # three references to the same inner list

def change_value(x, i):
    x[0][0] = i
    return x

w = Parallel(n_jobs=1, backend='threading')(
    delayed(change_value)(a, n) for n in xrange(3))
w
# reported output: [[[0], [0], [0]], [[1], [1], [1]], [[2], [2], [2]]]

w = Parallel(n_jobs=2, backend='multiprocessing')(
    delayed(change_value)(a, n) for n in xrange(3))
w
# reported output: [[[0], [0], [0]], [[1], [1], [1]], [[2], [2], [2]]]

w = []
for n in xrange(3):
    w.append(change_value(a, n))

In [67]: w
Out[67]: [[[2], [2], [2]], [[2], [2], [2]], [[2], [2], [2]]]

Contributor Author

And yeah, the test cases need to be updated.

Member

joblib.Parallel will not take care of it in the n_jobs=1 case...

Member

Sorry I take that back.

Member

So if joblib is performing a copy regardless, are you sure we need this copy.copy?

@jnothman (Member) left a comment

Otherwise LGTM



class MultiOutputEstimator(six.with_metaclass(ABCMeta, BaseEstimator)):

    def __init__(self, estimator, n_jobs=1):
        self.estimator = estimator
        self.n_jobs = n_jobs

    @if_delegate_has_method('estimator')
    def partial_fit(self, X, y, classes=None, sample_weight=None):
        """ Fit the model to data with Stochastic Gradient Descent.
Member

Still, SGD?

def __init__(self, estimator, n_jobs=1):
    super(MultiOutputRegressor, self).__init__(estimator, n_jobs)

def partial_fit(self, X, y, sample_weight=None):
    """ Fit the model to data with Stochastic Gradient Descent..
Member

SGD?

def test_mutli_output_classifiation_partial_fit_no_first_classes_exception():
    sgd_linear_clf = SGDClassifier(loss='log', random_state=1)
    multi_target_linear = MultiOutputClassifier(sgd_linear_clf)
    assert_raises_regex(ValueError, "classes must be passed on the first call to partial_fit.",
Member

PEP8 line length

@jnothman (Member)

I looked over the tests yesterday, but can't recall if I checked for one that ensured that

mor.partial_fit(X, y)
est1 = mor.estimators_[0]
mor.partial_fit(X, y)
est2 = mor.estimators_[0]
# parallelism requires this to be the case for a sane implementation
assert_false(est1 is est2)

regardless of n_jobs == 1 or n_jobs > 1? This is a subtlety but your copy code reminded me it's worth ensuring.

class MultiOutputEstimator(six.with_metaclass(ABCMeta, BaseEstimator)):

    def __init__(self, estimator, n_jobs=1):
        self.estimator = estimator
        self.n_jobs = n_jobs

    @if_delegate_has_method('estimator')
    def partial_fit(self, X, y, classes=None, sample_weight=None):
        """ Fit the model to data with SGD.
Member

No, what I'm saying is that it's not always SGD. Why do you mention SGD??

Contributor Author

Oh man... I was so obsessed with SGD; I'll delete that. I've been working heavily on SGD-related projects recently, sorry about that.

def __init__(self, estimator, n_jobs=1):
    super(MultiOutputRegressor, self).__init__(estimator, n_jobs)

def partial_fit(self, X, y, sample_weight=None):
    """ Fit the model to data with SGD.
Member

No, what I'm saying is that it's not always SGD. Why do you mention SGD??

@yupbank force-pushed the add-partial-fit-to-multioutout branch from 28031e0 to 59261e2 on December 22, 2016
@maniteja123 (Contributor)

Sorry for the digression, @yupbank and @jnothman. I just thought the check could be removed if the decorator is used.

@jnothman (Member) left a comment

Apart from nitpicks, LGTM.

Please add an "Enhancement" entry in whats_new.rst. Thanks!

class MultiOutputEstimator(six.with_metaclass(ABCMeta, BaseEstimator)):

    def __init__(self, estimator, n_jobs=1):
        self.estimator = estimator
        self.n_jobs = n_jobs

    @if_delegate_has_method('estimator')
    def partial_fit(self, X, y, classes=None, sample_weight=None):
        """ Fit the model to data.
Member

Say "Incrementally fit". Also remove space between quotes and words.

@jnothman changed the title from "add partial_fit to multioupt module" to "[MRG+1] add partial_fit to multioutput module" on Dec 23, 2016
@raghavrv (Member) left a comment

Thanks a lot for the PR! A few minor comments and questions.

if first_time:
    estimator = clone(estimator)
else:
    estimator = copy.deepcopy(estimator)
Member

Why? Sorry if this question was answered before...

Member

Once the estimators are cloned, can they not be fitted in parallel? I don't understand why you need to copy them? Maybe I'm missing something...

Contributor Author

It has been answered: it's to ensure that fitting happens in different processes on different estimators, and partial_fit requires being called multiple times.

Member

Yes, you will be partial-fitting one estimator per process, right? Unless you are using the same estimator in two processes, I don't see the problem here... Maybe I'm just completely blind to some detail. Would you kindly clarify, please?

The way I see it, we have one estimator per target. Say there are two targets: we have est1 and est2. And if you give n_jobs >= 2, est1 will partial_fit on process 1 and est2 on process 2, correct? Both of these are cloned from the original est instance and hence should train on the required batch of data without any side effects from parallelism...

Contributor Author

Oh... yeah, when doing the second fit I already have separate estimators, so no clone/copy is needed anymore.

Thanks for the catch 👍
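To make the agreed behaviour concrete, a sketch of the parallel dispatch inside the meta-estimator's partial_fit, reusing the _partial_fit_estimator helper sketched earlier (approximate, not the merged code verbatim; sklearn.externals.joblib is the import path of this era):

from sklearn.externals.joblib import Parallel, delayed

first_time = not hasattr(self, 'estimators_')
self.estimators_ = Parallel(n_jobs=self.n_jobs)(
    delayed(_partial_fit_estimator)(
        # Clone from self.estimator only on the first call; afterwards
        # each output column keeps and updates its own fitted estimator.
        self.estimator if first_time else self.estimators_[i],
        X, y[:, i],
        classes[i] if classes is not None else None,
        sample_weight, first_time)
    for i in range(y.shape[1]))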

@@ -44,6 +44,10 @@ New features
Enhancements
............

- :class:`multioutput.MultiOutputRegressor` and :class:`multioutput.MultiOutputClassifier`
  now support incremental fit using `partial_fit`.
Member

"support online learning using partial_fit" sounds sexy? (nitpick of the highest order. Feel free to ignore)

    Multi-output targets.

classes : list of numpy arrays, shape (n_outputs)
    each array is unique classes for one output in str/int
Member

Each (capital E)

classes : list of numpy arrays, shape (n_outputs)
    each array is unique classes for one output in str/int
    Can be obtained by via
    `[np.unique(y[:, i]) for i in range(y.shape[1])]`, where y is the
Member

Double backticks

self : object
    Returns self.
"""

Member

blank line can be removed


def test_multi_output_classification_partial_fit_parallelism():
    sgd_linear_clf = SGDClassifier(loss='log', random_state=1)
    mor = MultiOutputClassifier(sgd_linear_clf)
Member

Why is n_jobs not set for parallel test?

sgd_linear_clf.partial_fit(
    X[:half_index], y[:half_index, i], classes=classes[i])
sgd_linear_clf.partial_fit(X[half_index:], y[half_index:, i])
assert_equal(list(sgd_linear_clf.predict(X)), list(predictions[:, i]))
Member

you can use assert_array_equal here...

multi_target_linear = MultiOutputClassifier(sgd_linear_clf)

# train the multi_target_linear and also get the predictions.
half_index = int(X.shape[0] / 2)
Member

you could use X.shape[0] // 2 instead of using int...

Member

(If not already done, use from __future__ import division at top of file)

half_index = int(X.shape[0] / 2)
multi_target_linear.partial_fit(
    X[:half_index], y[:half_index], classes=classes)
multi_target_linear.partial_fit(X[half_index:], y[half_index:])
Member

Would it be better to test whether the predictions are the same for both multi_target_linear and sgd_linear_clf after each partial fit?

clf = MultiOutputClassifier(sgd_linear_clf)
clf.fit(X, y)
X_test = [[1.5, 2.5, 3.5]]
assert_almost_equal(clf.predict(X_test), clf_w.predict(X_test))
Member

assert_array_almost_equal

@raghavrv changed the title from "[MRG+1] add partial_fit to multioutput module" to "[MRG + 1] add partial_fit to multioutput module" on Jan 3, 2017
@raghavrv added this to the 0.19 milestone on Jan 3, 2017
@@ -15,13 +15,15 @@
# License: BSD 3 clause

import numpy as np
import copy
Member

Can this be removed now?

@raghavrv (Member) left a comment

With that I'm +1 for merge once CIs pass...

sgd_linear_clf.partial_fit(
    X[:half_index], y[:half_index, i], classes=classes[i])
sgd_linear_clf.partial_fit(X[half_index:], y[half_index:, i])
assert_array_equal(sgd_linear_clf.predict(X), predictions[:, i])
Member

Could you try and put both these blocks in a loop? It'd save us some redundancy in code.

(sorry for the nagging)

Contributor Author

sure thing :)

@tguillemot (Contributor)

@yupbank If you have added new partial_fit methods, can you please update doc/modules/scaling_strategies.rst as done in #8152?

@yupbank (Contributor Author) commented Jan 5, 2017

Neither sklearn.multioutput.MultiOutputClassifier nor sklearn.multioutput.MultiOutputRegressor guarantees that partial_fit will work; it depends on the estimator being passed.

So I'm not sure it is the right thing to do to list them among the online learners.

@tguillemot (Contributor)

@yupbank Sorry indeed, forget what I said :).

@raghavrv (Member) left a comment

Thanks a lot... I'm merging!

@raghavrv merged commit 8695ff5 into scikit-learn:master on Jan 5, 2017
@tguillemot (Contributor)

Thx @yupbank

raghavrv pushed a commit to raghavrv/scikit-learn that referenced this pull request Jan 5, 2017
* add partial_fit to multioupt module

* fix range in python3

* fix flake8

* fix the comments

* fix according to comments

* fix lint

* remove pytest

* fix ValueException message

* py 3.5 compatiable classes

* fix stuff

* fix according the comments

* remove used copy

* flake8..

* fix docs

* eventually, i use deepcopy to ensure the parallel

* lint..

* address final comment

* fix addressing the comments

* update confirmed separate estimators

* finally remove copy

* compact test
sergeyf pushed a commit to sergeyf/scikit-learn that referenced this pull request Feb 28, 2017
@Przemo10 mentioned this pull request Mar 17, 2017
Sundrique pushed a commit to Sundrique/scikit-learn that referenced this pull request Jun 14, 2017
NelleV pushed a commit to NelleV/scikit-learn that referenced this pull request Aug 11, 2017
paulha pushed a commit to paulha/scikit-learn that referenced this pull request Aug 19, 2017
maskani-moh pushed a commit to maskani-moh/scikit-learn that referenced this pull request Nov 15, 2017