Description
According to the documentation at http://scikit-learn.org/stable/modules/label_propagation.html, when the alpha parameter (the clamping factor) is 1, the algorithm performs hard clamping, which should not allow the labels of the labelled points to change. However, that is not what happens in the example below, which I copied and slightly modified from http://scikit-learn.org/stable/modules/generated/sklearn.semi_supervised.LabelPropagation.html#sklearn.semi_supervised.LabelPropagation
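For reference, hard clamping as the documentation describes it would reset the labelled rows after every propagation step. A minimal sketch of one such step (T, Y, Y0 and labeled_mask are illustrative names, not sklearn internals):

# Assumes:
#   T            - row-normalized graph transition matrix (n x n)
#   Y            - current label distributions (n x n_classes)
#   Y0           - initial one-hot label distributions
#   labeled_mask - boolean mask of the initially labelled points
Y = T.dot(Y)                        # propagate labels along the graph
Y[labeled_mask] = Y0[labeled_mask]  # hard clamp: reset the labelled rows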
Steps/Code to Reproduce
from sklearn import datasets
from sklearn.semi_supervised import LabelPropagation
import numpy as np

np.random.seed(1)  # fix random seed for reproducibility

label_prop_model = LabelPropagation()  # alpha defaults to 1, i.e. hard clamping per the docs
iris = datasets.load_iris()

# Randomly split the points into unlabelled and labelled subsets
inds = np.random.randint(0, 2, size=len(iris.target))
random_unlabeled_points = np.where(inds)
random_labeled_points = np.where(inds == 0)

labels = np.copy(iris.target)
labels[random_unlabeled_points] = -1  # -1 marks a point as unlabelled

label_prop_model.fit(iris.data, labels)
ypred = label_prop_model.transduction_

# Fraction of labelled points whose transduced label still matches the
# initial label; np.mean avoids integer division under Python 2
frac_unchanged = np.mean(iris.target[random_labeled_points] == ypred[random_labeled_points])
print(frac_unchanged)
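The same fraction should be printed when hard clamping is requested explicitly, since alpha is a constructor parameter of LabelPropagation in this version (a small follow-up check, reusing the variables above):

# Explicitly pass the documented hard-clamping value alpha=1
explicit_model = LabelPropagation(alpha=1)
explicit_model.fit(iris.data, labels)
print(np.mean(iris.target[random_labeled_points] == explicit_model.transduction_[random_labeled_points]))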
Expected Results
Since alpha = 1 performs hard clamping, I expected the transduced labels of the labelled points to remain equal to their initial labels, so the printed fraction should be 1. Any deviation from 1 indicates something is wrong.
Actual Results
Instead, I get 0.985294117647, indicating that some initial labels changed during transduction. (The value equals 67/68, consistent with exactly one of 68 labelled points flipping.)
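To pinpoint which labelled point(s) flipped, the two arrays can be compared directly (a hypothetical follow-up snippet, reusing the variables from the reproduction code):

# Positions (within the labelled subset) whose transduced label differs
flipped = np.flatnonzero(iris.target[random_labeled_points] != ypred[random_labeled_points])
print(flipped)
print(iris.target[random_labeled_points][flipped])  # original labels
print(ypred[random_labeled_points][flipped])        # transduced labels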
Versions
sklearn version: '0.18.1'
numpy version: '1.11.2'