This repository was archived by the owner on Nov 17, 2023. It is now read-only.
Dropout inconsistency bug #16705
Status: Open
In the following script, we should obtain the same dropout mask on every outer iteration, but currently the result depends on nrepeat. Note that I've turned off cuDNN dropout by setting cudnn_off=True.
import mxnet as mx
import numpy as np
import random
from numpy.testing import assert_allclose

base_y_np = None
for nrepeat in [1, 2, 3, 4]:
    seed = 123
    mx.random.seed(seed)
    np.random.seed(seed)
    random.seed(seed)
    x = mx.nd.ones((3, 3), ctx=mx.gpu())
    for _ in range(nrepeat):
        y = mx.nd.Dropout(x, cudnn_off=True)
    with mx.autograd.record():
        y = mx.nd.Dropout(x, cudnn_off=True)
    y_np = y.asnumpy()
    if base_y_np is None:
        base_y_np = y_np
    else:
        assert_allclose(base_y_np, y_np)

Output:
Not equal to tolerance rtol=1e-07, atol=0
Mismatch: 55.6%
Max absolute difference: 2.
Max relative difference: 1.
x: array([[0., 2., 0.],
[0., 0., 2.],
[0., 2., 0.]], dtype=float32)
y: array([[2., 2., 0.],
[0., 2., 0.],
[0., 0., 2.]], dtype=float32)
If we set nrepeat to the same value on every outer iteration, the result is consistent:
import mxnet as mx
import numpy as np
import random
from numpy.testing import assert_allclose

base_y_np = None
ctx = mx.gpu()
for nrepeat in [3, 3, 3, 3]:
    seed = 123
    mx.random.seed(seed)
    np.random.seed(seed)
    random.seed(seed)
    x = mx.nd.ones((3, 3), ctx=ctx)
    for _ in range(nrepeat):
        y = mx.nd.Dropout(x, cudnn_off=True)
    with mx.autograd.record():
        y = mx.nd.Dropout(x, cudnn_off=True)
    y_np = y.asnumpy()
    if base_y_np is None:
        base_y_np = y_np
    else:
        assert_allclose(base_y_np, y_np)
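The behavior the scripts above expect can be sketched in plain NumPy: after reseeding, forward passes that are not recorded should act as the identity and draw no random state, so the recorded pass produces the same mask regardless of how many warm-up passes ran. This is only an illustrative sketch; `dropout` and its `train` flag here are hypothetical stand-ins, not the MXNet API.

```python
import numpy as np

def dropout(x, p, rng, train):
    # Inverted dropout. In inference mode (train=False) it is the identity
    # and, crucially, consumes no random state -- the property the MXNet
    # repro scripts rely on for the non-recorded warm-up passes.
    if not train:
        return x
    mask = (rng.uniform(size=x.shape) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)

base = None
for nrepeat in [1, 2, 3, 4]:
    rng = np.random.RandomState(123)          # reseed per outer iteration
    x = np.ones((3, 3), dtype=np.float32)
    for _ in range(nrepeat):
        y = dropout(x, 0.5, rng, train=False)  # warm-up: identity, no RNG draws
    y = dropout(x, 0.5, rng, train=True)       # "recorded" training pass
    if base is None:
        base = y
    else:
        # Same mask for every nrepeat, since only the training pass drew
        # from the freshly seeded generator.
        np.testing.assert_allclose(base, y)
```

Under these semantics the assertion never fires, which is the consistency the issue asks mx.nd.Dropout with cudnn_off=True to provide.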