Missing LayerNormalization & MultiHeadAttention Keras Layers · Issue #797 · SciSharp/TensorFlow.NET

Open · lqdev opened this issue Apr 13, 2021 · 2 comments
lqdev commented Apr 13, 2021

@Oceania2018 Can you assign this issue to me? I'll take a stab at it.

I was looking into whether the Keras Transformer recommendation sample could be implemented in .NET. However, I believe two of the required layers, LayerNormalization and MultiHeadAttention, are missing from the current implementation.
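
For context, a typical Transformer encoder block in Keras uses both of these layers roughly as follows (Python; a minimal sketch with illustrative parameter values, not the exact code from the sample):

```python
import tensorflow as tf
from tensorflow.keras import layers

def transformer_block(inputs, num_heads=2, key_dim=32, ff_units=64):
    # Self-attention over the input sequence -- this is the
    # MultiHeadAttention layer that TensorFlow.NET does not yet provide.
    attn_output = layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=key_dim)(inputs, inputs)
    # Residual connection followed by LayerNormalization (also missing).
    x = layers.LayerNormalization(epsilon=1e-6)(inputs + attn_output)
    # Position-wise feed-forward network.
    ff = layers.Dense(ff_units, activation="relu")(x)
    ff = layers.Dense(inputs.shape[-1])(ff)
    return layers.LayerNormalization(epsilon=1e-6)(x + ff)
```

Without .NET equivalents of MultiHeadAttention and LayerNormalization, a block like this cannot be expressed with the current Keras API surface in TensorFlow.NET.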

GeorgeS2019 commented May 23, 2021

@lqdev As you work out your approach to MultiHeadAttention, perhaps you could share your feedback in this discussion thread.

Perhaps we will find more people interested in this topic for .NET.

