RFC: Support for running testsuite on minimal ports #2946

Closed
pfalcon opened this issue Mar 10, 2017 · 8 comments
Comments

@pfalcon (Contributor) commented Mar 10, 2017

For quite some time I've been working on making the testsuite pass (with skips) on the Zephyr port, which currently has quite a minimal configuration. It's now down to 9 failing tests:

9 tests failed: vfs_fat_fileio const const2 const_error heapalloc heapalloc_inst_call heapalloc_int_from_bytes heapalloc_iter opt_level

vfs_fat_fileio fails because it simply cannot be parsed in a 16KB heap. I don't see a (reliable?) way to test whether compile-time const is supported; any ideas? opt_level fails because line numbers aren't enabled. The weirdest cases are the heapalloc_* failures: note that some heapalloc_* tests pass while these fail. It might look as if the heap is locked but then needed by the parser/compiler, yet that doesn't explain why the other heapalloc_* tests work.
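For context, here is a rough, hypothetical sketch of the pattern the heapalloc_* tests follow (this is not the contents of any actual test file, and the function name is made up): code that must run without heap allocation is executed between micropython.heap_lock() and micropython.heap_unlock().

```python
# Hypothetical sketch of the heapalloc_* test pattern, not an actual test file.
import micropython

def add(a, b):
    # Small-int arithmetic is expected to work without heap allocation.
    return a + b

micropython.heap_lock()    # any heap allocation from here on raises MemoryError
print(add(1, 2))           # printing a small int should also be allocation-free
micropython.heap_unlock()
```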

At the current stage, I'm keen to just create a special --target for run-tests and skip these tests for it. We apparently should stop naming these targets after hardware platforms and give them generic "profile" names instead. In this case, "minimal" sounds natural, though that may still cause confusion, since the actual minimal port certainly won't pass it yet.

@dpgeorge (Member) commented:

> I don't see a (reliable?) way to test whether compile-time const is supported; any ideas?

If it doesn't pass then surely there is a programmatic reason that can be used to check for const? Eg, the hasattr line in the const.py test might be used. Ideally one would check for the const feature using a feature-check script, then skip those tests if that check fails.
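As a rough sketch of that feature-check-then-skip flow (the helper name, file paths, and expected output below are assumptions for illustration, not the actual run-tests.py code):

```python
# Hypothetical sketch of "run a feature-check script, then skip dependent
# tests"; helper name, paths and expected output are assumptions.
import subprocess

def feature_check(script):
    # Run a tiny script on the target interpreter and return its raw output.
    return subprocess.run(["micropython", script], capture_output=True).stdout

skip_tests = set()
if feature_check("feature_check/const.py") != b"1\n":
    # Compile-time const not supported: skip the tests that rely on it.
    skip_tests.update({
        "micropython/const.py",
        "micropython/const2.py",
        "micropython/const_error.py",
    })
```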

> The weirdest cases are the heapalloc_* failures.

I tried but couldn't get the zephyr port running (it built, but qemu_x86 just sat there doing nothing; I was using the minimal build). These heapalloc tests should definitely pass on minimal builds, so this needs looking into, perhaps using the unix minimal build. Can you reproduce the errors there?

> We apparently should stop naming these targets after hardware platforms and give them generic "profile" names instead.

But you still need to distinguish between a target run on the host vs one run remotely (eg over serial using pyboard.py). Eg there could be a minimal host target and a minimal remote target.

@pfalcon (Contributor, Author) commented Mar 12, 2017

> If it doesn't pass then surely there is a programmatic reason that can be used to check for const?

Well, that's why I opened this ticket: to ask for help with whether and how to do that. An obvious test for a constant would be to create one and then assign to it; that should fail. Does it in MicroPython? Regardless of yes or no, will it stay that way tomorrow, in a month, in a year? I don't remember the exact state of the const features now, but I do know that we keep updating the featureset.

> Eg, the hasattr line in the const.py test might be used.

Hmm, I wouldn't think so myself. That test appears to check for a particular sub-behavior of const, not for availability of const support.

> Ideally one would check for the const feature using a feature-check script, then skip those tests if that check fails.

Yep, that's how I'd do it. Factored out like that, even a dirty hack would work (and could be fixed once it breaks), so please confirm that you suggest doing it as above, and I'll do that. My concern is that the hackishness of the test setup framework keeps growing: e.g. in 854bb32, the change to int_big.py.exp is void, because we don't actually check the .exp output for feature_check tests; instead I need to hardcode the expected output of such a test directly into run-tests.
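To illustrate the concern (all names and the expected value below are assumptions, not the actual run-tests.py code): the comparison value lives in run-tests itself, so editing the .exp file next to the feature-check script changes nothing.

```python
# Hypothetical illustration: the expected output of a feature_check script
# is hardcoded in run-tests rather than read from int_big.py.exp.
import subprocess

def feature_check(script):
    return subprocess.run(["micropython", script], capture_output=True).stdout

skip_tests = set()
output = feature_check("feature_check/int_big.py")
if output != b"1000000000000000000000000\n":  # hardcoded; the .exp file is ignored
    skip_tests.add("basics/some_big_int_test.py")  # placeholder test name
```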

@pfalcon (Contributor, Author) commented Mar 12, 2017

> I tried but couldn't get the zephyr port running (it built, but qemu_x86 just sat there doing nothing; I was using the minimal build).

Thanks for the report, let's move that to #2481 (comment)

@dpgeorge (Member) commented:

> The weirdest cases are the heapalloc_* failures.

It was because the long-long int implementation allocated memory to print. Fixed by d1ae6ae. Those heapalloc tests should now all pass.

@dpgeorge (Member) commented:

> so please confirm that you suggest doing it as above,

Yes, please make a feature check for const. Probably best to use exec('x=const("A")').
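A sketch of what such a feature-check script might look like, assuming that a build with compile-time const rejects a non-integer argument to const() with a SyntaxError, while a build without it simply sees an undefined name at run time (the printed markers and this exact structure are likewise assumptions, not the committed script):

```python
# Hypothetical const feature-check script along the lines suggested above.
try:
    exec('x = const("A")')
    print("no compile-time const")  # const() was treated as an ordinary call
except SyntaxError:
    print("compile-time const")     # compiler rejected the non-integer const()
except NameError:
    print("no compile-time const")  # const not special-cased and not defined
```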

@pfalcon (Contributor, Author) commented Mar 14, 2017

> It was because the long-long int implementation allocated memory to print. Fixed by d1ae6ae.

Great, thanks. I'm still working through my backlog to get back to this stuff.

@pfalcon (Contributor, Author) commented Apr 2, 2017

> Probably best to use exec('x=const("A")').

Looking at it today, I didn't quite get the idea behind this expression, so I used a simpler one: b099aeb, which differentiates the zephyr port vs the unix port. Please check whether something's wrong there.

@pfalcon (Contributor, Author) commented Apr 3, 2017

Ok, the final step was introducing a generic "minimal" target for run-tests: 831e157. It's roughly based on the esp8266 target, but additionally skips the opt_level.py test (actually, as can be seen from the patch, its ad hoc skip list is smaller than esp8266's, with only rge_sm.py skipped additionally).
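A rough, self-contained sketch of the per-target skip idea (names, paths and structure are assumptions for illustration, not the actual run-tests.py or commit contents):

```python
# Hypothetical per-target skip lists; "minimal" is modelled on esp8266 but
# with its own small ad hoc additions, as described above.
TARGET_SKIPS = {
    "esp8266": {"misc/rge_sm.py"},  # plus other ad hoc skips (placeholder)
    "minimal": {"misc/rge_sm.py", "micropython/opt_level.py"},
}

def filter_tests(all_tests, target):
    skips = TARGET_SKIPS.get(target, set())
    return [t for t in all_tests if t not in skips]

print(filter_tests(["basics/int1.py", "micropython/opt_level.py"], "minimal"))
# -> ['basics/int1.py']
```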

With this, I can finally run the testsuite against a Zephyr port device (316 tests run, 248 skipped).

Closing.

@pfalcon closed this as completed Apr 3, 2017