I'm very interested in this extension, but the sliders seem unorganized. Rather than exposing only the min and max value of every embedding, perhaps using `torch.pca_lowrank()` (or something similar) to arrange the features of the embeddings would allow for a more natural modification of tokens?
However, there's a big issue with this: there are a lot of tokens that people will likely have no interest in. So instead of a PCA over every embedding, let the user choose a handful of tokens they're interested in (or that share similar qualities), run PCA on just that subset, and have the sliders act within that space: multiply by the matrix PCA generates, adjust the components, then apply the inverse transform to map the weights back into the original space. The number of sliders would then become the minimum of the number of features and the number of selected embeddings; I don't know if that'll be an issue for the UI. Also, the `q` parameter of `torch.pca_lowrank()` will need to be set explicitly, or it'll cap out at its default of 6, giving only 6 sliders.
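A minimal sketch of what I mean, assuming a small user-chosen set of token embeddings (the tensor names and sizes here are illustrative, not from this extension):

```python
import torch

torch.manual_seed(0)

# A hypothetical handful of user-selected token embeddings.
n_tokens, dim = 8, 768
embeddings = torch.randn(n_tokens, dim)

# One slider per PCA component: min(number of embeddings, number of features).
# q must be passed explicitly, since torch.pca_lowrank defaults to
# q = min(6, n_tokens, dim), which would give only 6 sliders.
q = min(n_tokens, dim)

# Center manually so the same mean can be added back after the inverse transform.
mean = embeddings.mean(dim=0, keepdim=True)
U, S, V = torch.pca_lowrank(embeddings - mean, q=q, center=False)

# Forward transform: project into PCA space, giving one coordinate per slider.
coords = (embeddings - mean) @ V  # shape (n_tokens, q)

# A slider edit: nudge the first principal component of the first token.
edited_coords = coords.clone()
edited_coords[0, 0] += 1.0

# Inverse transform: V has orthonormal columns, so V.T maps PCA
# coordinates back into the original embedding space.
edited = edited_coords @ V.T + mean  # shape (n_tokens, dim)
```

Since `V` is orthonormal, the "inverse matrix" is just its transpose, so the round trip is cheap and (with `q` at full rank) lossless up to floating-point error.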