Perplexity not monotonically decreasing for batch Latent Dirichlet Allocation #6777
Open

Description

@amueller

When using the batch method, the perplexity in LDA should be non-increasing in every iteration, right?
I have cases where it does increase. If this is indeed a bug, I'll investigate.
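For batch variational inference, each full EM iteration should not decrease the evidence lower bound, and scikit-learn's `perplexity` is a monotone decreasing transform of that bound (roughly `exp(-bound / total_token_count)`), which is why a monotone decrease is expected. Below is a minimal sketch, not taken from the issue itself, of one way to check this. It assumes that refitting with a fixed `random_state` and an increasing `max_iter` replays the same deterministic batch run from the same initialization (which holds with the default `evaluate_every=-1`, since that disables perplexity-based early stopping), and it uses synthetic count data purely for illustration.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.decomposition import LatentDirichletAllocation

# Synthetic bag-of-words counts, purely illustrative.
X, _ = make_multilabel_classification(n_samples=100, n_features=50, random_state=0)

perplexities = []
for n_iter in range(1, 11):
    # Fixed random_state => identical initialization; with batch updates and
    # no early stopping, max_iter=k should replay the first k iterations of
    # a single run, so successive fits trace one optimization trajectory.
    lda = LatentDirichletAllocation(
        n_components=5,
        learning_method="batch",
        max_iter=n_iter,
        random_state=0,
    )
    lda.fit(X)
    perplexities.append(lda.perplexity(X))

print(perplexities)
# If perplexity were monotone non-increasing, every successive difference
# would be <= 0 (small tolerance for floating-point noise).
print("non-increasing:", bool(np.all(np.diff(perplexities) <= 1e-8)))
```

(Parameter names follow current scikit-learn; when this issue was filed, `n_components` was still called `n_topics`.)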
