`tensorflow`: Add missing members to the `tensorflow.keras.layers` module. by hoel-bagard · Pull Request #11333 · python/typeshed · GitHub

Merged: 27 commits, merged on Mar 13, 2024

Commits (27; the changes shown below are from 1 commit)
ca822c2  Add some missing keras layers (hoel-bagard, Jan 28, 2024)
6e26f9b  Move some layers to keras.layers.preprocessing (hoel-bagard, Jan 28, 2024)
35056a7  fix: add MultiHeadAttention's __call__ to the stubtest allowlist (hoel-bagard, Jan 28, 2024)
6e83aa6  remove MultiHeadAttention... (hoel-bagard, Jan 31, 2024)
cd8632a  Revert "remove MultiHeadAttention..." (hoel-bagard, Feb 1, 2024)
755b0ab  type ignore the override (hoel-bagard, Feb 1, 2024)
d543957  Merge branch 'main' into hoel/add_tf_keras_layers (hoel-bagard, Feb 4, 2024)
9eda7f3  try to fix mypy crash. (hoel-bagard, Feb 4, 2024)
2c41a03  add modules to allowlist due to cursed imports. (hoel-bagard, Feb 5, 2024)
4bf4ad8  Merge branch 'main' into hoel/add_tf_keras_layers (hoel-bagard, Feb 8, 2024)
976b9c8  remove tensorflow.keras.layers.MultiHeadAttention.__call__ from allow… (hoel-bagard, Feb 8, 2024)
37f1b7c  Merge branch 'main' into hoel/add_tf_keras_layers (hoel-bagard, Feb 17, 2024)
a7739fb  test (hoel-bagard, Feb 17, 2024)
f5f74c2  Merge branch 'main' into hoel/add_tf_keras_layers (JelleZijlstra, Feb 17, 2024)
03c3e9b  Revert "test" (hoel-bagard, Feb 17, 2024)
aa05ac7  Merge branch 'main' into hoel/add_tf_keras_layers (JelleZijlstra, Feb 17, 2024)
7d0343f  fix: tuple -> Iterable (hoel-bagard, Feb 17, 2024)
37c668b  add PreprocessingLayer methods/overloads (hoel-bagard, Feb 17, 2024)
4228893  fix PreprocessingLayer typing (hoel-bagard, Feb 17, 2024)
31134a1  make IndexLookup private (hoel-bagard, Feb 17, 2024)
aaa3738  silence/ignore mypy error. (hoel-bagard, Feb 17, 2024)
ac3b558  fix: make PreprocessingLayer's is_adapted into a property. (hoel-bagard, Feb 17, 2024)
bc7926a  Merge branch 'main' into hoel/add_tf_keras_layers (JelleZijlstra, Feb 17, 2024)
05ce2ed  Merge branch 'main' into hoel/add_tf_keras_layers (rchen152, Feb 27, 2024)
189e918  Merge branch 'main' into hoel/add_tf_keras_layers (hoel-bagard, Mar 12, 2024)
849279e  try to fix pytype issue (hoel-bagard, Mar 13, 2024)
3d1cfbc  merge with main (hoel-bagard, Mar 13, 2024)
remove MultiHeadAttention...
hoel-bagard committed Jan 31, 2024
commit 6e83aa66f2797cd2359aadb9d5e2a55848bedd16
1 change: 0 additions & 1 deletion stubs/tensorflow/@tests/stubtest_allowlist.txt
@@ -60,7 +60,6 @@ tensorflow.keras.layers.*.__init__
tensorflow.keras.layers.*.call
tensorflow.keras.regularizers.Regularizer.__call__
tensorflow.keras.constraints.Constraint.__call__
-tensorflow.keras.layers.MultiHeadAttention.__call__

# Layer class does good deal of __new__ magic and actually returns one of two different internal
# types depending on tensorflow execution mode. This feels like implementation internal.
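The comment above describes why a stub cannot model `Layer` faithfully: its `__new__` hands back one of two internal types chosen at runtime. A minimal sketch of that pattern, with invented class names that do not match TensorFlow's real internals:

```python
# Hypothetical illustration of "__new__ magic": the names _EagerLayer,
# _GraphLayer, and the `eager` flag are made up for this sketch.
class _EagerLayer:
    mode = "eager"

class _GraphLayer:
    mode = "graph"

class Layer:
    # __new__ returns one of two unrelated internal types depending on a
    # mode flag, so the runtime type of Layer(...) is never Layer itself.
    def __new__(cls, eager: bool = True):
        return _EagerLayer() if eager else _GraphLayer()

print(Layer().mode)             # eager
print(Layer(eager=False).mode)  # graph
```

Since the returned object is not an instance of `Layer`, its `__init__` is never invoked and stubtest sees a mismatch between the stub and the runtime, hence the allowlist entries.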
58 changes: 1 addition & 57 deletions stubs/tensorflow/tensorflow/keras/layers/__init__.pyi
@@ -10,7 +10,7 @@ from tensorflow.keras.activations import _Activation
from tensorflow.keras.constraints import Constraint
from tensorflow.keras.initializers import _Initializer
from tensorflow.keras.layers.preprocessing import IntegerLookup as IntegerLookup, StringLookup as StringLookup
-from tensorflow.keras.regularizers import Regularizer, _Regularizer
+from tensorflow.keras.regularizers import _Regularizer
from tensorflow.python.feature_column.feature_column_v2 import DenseColumn, SequenceDenseColumn

_InputT = TypeVar("_InputT", contravariant=True)
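The `_InputT` TypeVar above is declared contravariant, which is the natural variance for a type parameter that only appears in input position. A small self-contained sketch of why (the `Sink`/`feed_ints` names are invented for illustration):

```python
from typing import Generic, TypeVar

# Contravariant: appears only as a parameter type, never as a return type.
_InputT = TypeVar("_InputT", contravariant=True)

class Sink(Generic[_InputT]):
    def consume(self, x: _InputT) -> None: ...

def feed_ints(s: "Sink[int]") -> None:
    s.consume(1)

# A Sink[object] accepts anything, so a type checker allows it wherever a
# Sink[int] is expected; the reverse direction would be unsafe.
feed_ints(Sink[object]())
```

The same reasoning lets a `Layer` typed to accept a broad input be used where a layer accepting a narrower input is expected.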
@@ -255,62 +255,6 @@ class DenseFeatures(Layer[Mapping[str, TensorLike], tf.Tensor]):
        name: str | None = None,
    ) -> None: ...

-class MultiHeadAttention(Layer[Any, tf.Tensor]):
-    def __init__(
-        self,
-        num_heads: int,
-        key_dim: int | None,
-        value_dim: int | None = None,
-        dropout: float = 0.0,
-        use_bias: bool = True,
-        output_shape: tuple[int, ...] | None = None,
-        attention_axes: tuple[int, ...] | None = None,
-        kernel_initialize: _Initializer = "glorot_uniform",
-        bias_initializer: _Initializer = "zeros",
-        kernel_regularizer: Regularizer | None = None,
-        bias_regularizer: _Regularizer | None = None,
-        activity_regularizer: _Regularizer | None = None,
-        kernel_constraint: _Constraint | None = None,
-        bias_constraint: _Constraint | None = None,
-        trainable: bool = True,
-        dtype: _LayerDtype | None = None,
-        dynamic: bool = False,
-        name: str | None = None,
-    ) -> None: ...
-    @overload
-    def __call__(
-        self,
-        query: tf.Tensor,
-        value: tf.Tensor,
-        key: tf.Tensor | None,
-        attention_mask: tf.Tensor | None,
-        return_attention_scores: Literal[False],
-        training: bool,
-        use_causal_mask: bool,
-    ) -> tf.Tensor: ...
-    @overload
-    def __call__(
-        self,
-        query: tf.Tensor,
-        value: tf.Tensor,
-        key: tf.Tensor | None,
-        attention_mask: tf.Tensor | None,
-        return_attention_scores: Literal[True],
-        training: bool,
-        use_causal_mask: bool,
-    ) -> tuple[tf.Tensor, tf.Tensor]: ...
-    @overload
-    def __call__(
-        self,
-        query: tf.Tensor,
-        value: tf.Tensor,
-        key: tf.Tensor | None = None,
-        attention_mask: tf.Tensor | None = None,
-        return_attention_scores: bool = False,
-        training: bool = False,
-        use_causal_mask: bool = False,
-    ) -> tuple[tf.Tensor, tf.Tensor] | tf.Tensor: ...

class GaussianDropout(Layer[tf.Tensor, tf.Tensor]):
    def __init__(
        self,