make the viewer more responsive when selecting/first showing a big array #93
Comments
Isn't making the viewer (editor) more responsive the purpose of implementing a "Data Buffer", as suggested in #33?
The buffer for the visible data is indeed part of the picture. But here, before we visualize anything, we work on the whole array to compute the min and max values, which is too slow for largish arrays. And I am unsure that computing the min/max only on the current buffer is a good enough solution. Having a global min/max that is first computed on the buffer and "enlarged" as necessary when we get more data might be a good solution, and/or we could do like we do for ndigits/scientific: compute it on a sample (pick a few hundred values from all over the array, to minimize bias for "increasing" arrays).
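The sampling idea described above could be sketched as follows. This is an illustrative sketch, not the viewer's actual code; the function name, sample size and the evenly-spaced-index strategy are assumptions.

```python
import numpy as np


def sample_minmax(arr, sample_size=500):
    """Estimate vmin/vmax from a sample spread across the whole array.

    Hypothetical helper: using evenly spaced indices over the flattened
    array (instead of e.g. only the first values) limits the bias for
    arrays whose values increase along an axis.
    """
    flat = np.asarray(arr).ravel()
    if flat.size <= sample_size:
        sample = flat
    else:
        # evenly spaced indices covering the full range, endpoints included
        indices = np.linspace(0, flat.size - 1, sample_size).astype(np.intp)
        sample = flat[indices]
    return sample.min(), sample.max()
```

Because the estimate comes from a sample, the true min/max of the displayed part can fall outside the returned bounds, which is exactly why the color code then needs to tolerate out-of-range values or enlarge the bounds as more data is loaded.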
FWIW, the slowdown I am talking about here is the time it takes between selecting the array in the list and the data being displayed. This issue is not about scrolling speed, unlike #38.
… (issue larray-project#93) (so that we compute & redraw the new visible data only once)
…ox for axes >= 10000 elements (issue larray-project#93)
…es per change (issue larray-project#93)
…t#93)
* one of the goals was to make switching from one array to another as fast as possible by cutting down on repeated calls (to the various set_data and datamodel.reset())
* tightened what is accepted by each internal class. There is only one expected type for the data; external-facing classes should still accept the same objects.
* all internal classes which "hold" data are created without any data in __init__ but require a set_data call before they can function.
* data_model.reset_minmax needs to be called explicitly.
* data_model.reset() needs to be called explicitly when "everything is done".
on the whole array but on a sample (fixes larray-project#93). This means we can miss the actual min/max of the displayed part, so it complicates the code. Most of this awful code will go away after:
* we invert how changes work (store old values instead of new values)
* we get decent buffering (in that case the min/max should only be updated when "moving" the buffer).
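The "global min/max enlarged as necessary" approach mentioned in the thread could look roughly like this. The class name and API are hypothetical, a sketch under the assumption that the viewer can feed each newly visible buffer to a tracker:

```python
import numpy as np


class MinMaxTracker:
    """Keep running global vmin/vmax bounds that start from the first
    visible buffer and are only enlarged when new data comes into view.

    Hypothetical helper, not the viewer's actual class: updating on
    buffer moves avoids ever scanning the whole array up front.
    """

    def __init__(self):
        self.vmin = None
        self.vmax = None

    def update(self, buffer):
        """Enlarge the bounds with the min/max of a newly visible buffer."""
        bmin = float(np.min(buffer))
        bmax = float(np.max(buffer))
        self.vmin = bmin if self.vmin is None else min(self.vmin, bmin)
        self.vmax = bmax if self.vmax is None else max(self.vmax, bmax)
        return self.vmin, self.vmax
```

With this design the bounds only ever grow, so already-drawn cells keep valid (if slightly stale) colors until the next redraw.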
Currently it slows down to a crawl when working with big arrays.
The vmin/vmax computation could be the culprit. Being fast is more important than having perfect colors, so IF that is indeed the cause of the slowdown, we should use a sample, like for computing ndigits. But then the color code needs to be adapted to cope with having colorval > vmax, OR, preferably, we should update vmin/vmax as we go, when loading more data.
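Coping with colorval > vmax (or < vmin) can be as simple as clipping the normalized value before mapping it to a color. A minimal sketch, assuming a gradient driven by a fraction in [0, 1]; the function name is hypothetical:

```python
import numpy as np


def color_fraction(values, vmin, vmax):
    """Map cell values to [0, 1] for the background-color gradient.

    Clipping makes the mapping safe when vmin/vmax were estimated from a
    sample and some displayed values fall outside the estimated range.
    """
    values = np.asarray(values, dtype=float)
    if vmax == vmin:
        # degenerate range: every cell gets the same color
        return np.zeros_like(values)
    frac = (values - vmin) / (vmax - vmin)
    return np.clip(frac, 0.0, 1.0)
```

Out-of-range values simply saturate at the extreme colors instead of producing invalid ones, at the cost of losing contrast for those cells until the bounds are enlarged.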