ValueError: Buffer dtype mismatch, expected 'int' but got 'long' #13526
Comments
Please provide a complete, runnable example that reproduces your issue. It's very clear that the code you provided has little to do with the output provided.
import numpy as np
import scipy.sparse
from sklearn.linear_model import LogisticRegression

# Build a very tall sparse matrix (28,034,374 x 904) with only 222 non-zero entries.
size = 222
data = np.random.uniform(size=size)
row = np.random.randint(low=0, high=28034374, size=size, dtype=int)
col = np.random.randint(low=0, high=904, size=size, dtype=int)
inputs = scipy.sparse.csr_matrix((data, (row, col)), shape=(28034374, 904))

# Force the index arrays to the platform default int (int64 on 64-bit Linux)
# to mimic a matrix whose indices genuinely do not fit into int32.
inputs.indptr = inputs.indptr.astype(int)
inputs.indices = inputs.indices.astype(int)

targets = np.random.randint(low=0, high=2, size=[28034374])
model = LogisticRegression(
C=1,
solver='sag',
random_state=0,
tol=0.0001,
max_iter=100,
verbose=1,
warm_start=True,
n_jobs=64,
penalty='l2',
dual=False,
multi_class='ovr',
)
model.fit(inputs, targets)

In this example case I had to directly cast the indices to int64, since scipy would otherwise keep them as int32. I tried to cast my real data's indices to …
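For reference, the explicit cast in the example is needed because scipy typically keeps a CSR matrix's index arrays as int32 whenever the values fit in that range; upcasting by hand mimics a matrix that is genuinely too large for int32. A quick check of the dtypes (a sketch that reuses data, row, col and inputs from the example above):

# Rebuild the matrix without the explicit cast: scipy usually chooses
# int32 index arrays here because all index values fit in that range.
fresh = scipy.sparse.csr_matrix((data, (row, col)), shape=(28034374, 904))
print(fresh.indices.dtype, fresh.indptr.dtype)    # typically: int32 int32

# After the .astype(int) casts above (int64 on 64-bit Linux), fitting with
# solver='sag' raises the "Buffer dtype mismatch" ValueError.
print(inputs.indices.dtype, inputs.indptr.dtype)  # int64 int64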
(Also, please feel free to comment here or in the above-linked issue to continue the discussion.)
Hi, I also ran into this issue. Is there any workaround for this? Thanks!
@yujianll you could go for:
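For example, a sketch of one possible workaround, assuming the index values of your matrix actually fit into the int32 range (downcast_indices below is just an illustrative helper, not a scipy or scikit-learn function):

import numpy as np

def downcast_indices(X):
    # Cast a CSR matrix's index arrays back to int32, but only when every
    # stored value is guaranteed to fit; otherwise the cast would silently
    # corrupt the indices.
    int32_max = np.iinfo(np.int32).max
    if X.nnz <= int32_max and max(X.shape) <= int32_max:
        X.indices = X.indices.astype(np.int32)
        X.indptr = X.indptr.astype(np.int32)
    return X

# With int32 index arrays, solver='sag' should no longer hit the buffer
# dtype mismatch in this scenario.
inputs = downcast_indices(inputs)
model.fit(inputs, targets)

If the matrix is genuinely too large for int32 indices, this downcast is not an option.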
@Hoeze Thanks!
I'm trying to fit a logistic regression on a sparse matrix, but it fails with ValueError: Buffer dtype mismatch, expected 'int' but got 'long'.
Is there some way I can still train on my data?
Originally posted by @Hoeze in #10758 (comment)