Fix test_ops for tiny backend #9302
Conversation
This branch is currently behind tinygrad/master. The line count difference bot is disabled.
bbf0e83 to 9082525
Locked to you, since you have done more than half of these. Also, it needs to work correctly in principle, not just be hacked to make the test pass.
extra/torch_backend/backend.py
Outdated
out = Tensor.avg_pool2d(self, kernel_size, stride, dilation=1, padding=padding, ceil_mode=ceil_mode, count_include_pad=count_include_pad)
return wrap(out.gradient(self, gradient=grad_out)[0])

@torch.library.impl("aten::replication_pad1d_backward", "privateuseone")
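(For context: the backward above is computed by re-running the forward and asking tinygrad's autograd for the gradient with respect to the input. A minimal pure-Python sketch of what an average-pool backward amounts to, simplified to 1-D with stride equal to the kernel size and no padding; the function names and the 1-D setting are illustrative, not tinygrad's actual implementation:)

```python
def avg_pool1d(x, k):
    # non-overlapping windows: stride == k, no padding
    return [sum(x[i:i + k]) / k for i in range(0, len(x) - k + 1, k)]

def avg_pool1d_backward(grad_out, n, k):
    # each input element inside window w receives grad_out[w] / k
    grad_in = [0.0] * n
    for w, g in enumerate(grad_out):
        for j in range(w * k, w * k + k):
            grad_in[j] += g / k
    return grad_in

x = [1.0, 3.0, 2.0, 6.0]
y = avg_pool1d(x, 2)                              # [2.0, 4.0]
gx = avg_pool1d_backward([1.0, 1.0], len(x), 2)   # [0.5, 0.5, 0.5, 0.5]
```

Driving the backward through autograd, as the snippet above does, keeps the two passes consistent by construction; hand-written backwards like this sketch must be kept in sync with the forward (e.g. `count_include_pad` changes the divisor per window).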
Don't copy paste!
Feels like there's something subtly wrong. If I add contiguous and make it
Yeah, if I make the expand contiguous instead of |
Cool, the `realize` and `contiguous` need to be understood though.
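(For readers outside the thread: in a lazy tensor library an expand can be a zero-copy view, so a later write through that view aliases every broadcast copy; forcing a materialized buffer, which is roughly what `contiguous`/`realize` do in tinygrad, breaks the aliasing. A toy pure-Python illustration of the hazard, not tinygrad code:)

```python
# Zero-copy "expand": every row aliases the same underlying list.
base = [1.0, 2.0]
expanded = [base] * 3             # like an expand: three views of one row

expanded[0][0] = 99.0             # a write through one view...
leaked = expanded[1][0]           # ...is visible through all of them: 99.0

# A "contiguous"/materializing copy gives each row independent storage.
base2 = [1.0, 2.0]
materialized = [list(base2) for _ in range(3)]
materialized[0][0] = 99.0
isolated = materialized[1][0]     # unaffected: still 1.0
```

If inserting a `contiguous` changes a test result, that is usually a sign some op is writing through (or reading stale data from) a shared buffer, which matches the "subtly wrong" suspicion above.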
Not sure what it will take to get it right, but not figuring it out just makes iteration more difficult later. There's clearly a bug somewhere.
If you don't think you will be able to fix it, feel free to ask around on Discord, and you can figure out how to split the bounty.
Spent some time debugging the test_pad_circular_mode failure. Some observations:
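(For reference while debugging: circular padding wraps the tensor around, so the left pad is taken from the end of the input and the right pad from the beginning, matching `torch.nn.functional.pad(..., mode='circular')`. A minimal 1-D sketch of the semantics the test exercises; the function name is illustrative:)

```python
def pad_circular_1d(x, left, right):
    # left pad wraps from the end of x, right pad from the start
    # (valid only while each pad width is at most len(x))
    assert left <= len(x) and right <= len(x)
    return x[len(x) - left:] + x + x[:right]

padded = pad_circular_1d([1, 2, 3, 4], 2, 1)   # [3, 4, 1, 2, 3, 4, 1]
```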
Fine to merge other non-controversial changes so that this does not go stale.
Congrats! The bounty is yours. I can pay via PayPal or USDC on ETH; e-mail george@tinygrad.org to claim.
With `LLVM=1 LLVMOPT=0 TINY_BACKEND=1 python3 -m pytest -n auto test/test_ops.py`:

Test status on master: 149 failures. Test status on this PR: all passing.