Description
We have a fairly large project using js-data, and our unit test suite is closing in on 2000 tests. We are having serious trouble running them all to completion. The issue seems to come from DS, since removing it makes the performance problems go away.
I tried to recreate the problem in the js-data tests with the following code added to one of the test files:
```js
describe('DS#memory-leak', function () {
  function test() {
    // register 100 resources on the store
    for (var j = 0; j < 100; j++) {
      store.defineResource({
        name: 'resource' + j
      });
    }
  }

  for (var i = 0; i < 1000; i++) {
    it('Test number ' + i, test);
  }
});
```
This code just registers 100 resources on the store for each test, and runs 1000 such tests. Inspecting the memory usage of the PhantomJS process shows it climbing indefinitely until Phantom crashes because it cannot allocate more memory.
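One way to check whether the growth also happens outside PhantomJS is a standalone Node script. Below is a minimal sketch, assuming js-data 2.x on Node; the `--expose-gc` flag and `process.memoryUsage()` are Node specifics, not part of js-data, and creating a fresh `DS` per iteration is my guess at what the test harness's `beforeEach` does:

```js
// Standalone sketch (assumption: js-data 2.x on Node, run with `node --expose-gc`).
// Mirrors the test above: each iteration creates a fresh DS instance and
// registers 100 resources on it, then prints heap usage so any growth
// that survives a forced GC is visible.
var JSData = require('js-data');

for (var i = 0; i < 1000; i++) {
  var store = new JSData.DS();
  for (var j = 0; j < 100; j++) {
    store.defineResource({ name: 'resource' + j });
  }
  if (i % 100 === 0) {
    if (global.gc) global.gc(); // force GC so only retained objects remain
    console.log('iteration ' + i + ': heapUsed ' +
      Math.round(process.memoryUsage().heapUsed / 1048576) + ' MB');
  }
}
```

If the `DS` instances were collectable, `heapUsed` should stay roughly flat across iterations after the forced GC.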
Taking a heap dump after a lot of tests have executed in our application, we see hundreds of DS objects and thousands of Resource objects allocated that are never GCed. This does not seem to happen when just running the test above in js-data: dumping the heap there shows only a few DS objects and 100 or so Resource objects. Yet it still runs out of memory, and I can't understand exactly why.
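In case anyone wants to compare heap dumps from the Node sketch above, here is a small sketch using the third-party `heapdump` npm package, which writes snapshots that can be loaded and diffed in Chrome DevTools. This is just a tooling suggestion (the package is an assumption, not something js-data ships or what we used against Phantom):

```js
// Snapshot helper sketch (assumption: `npm install heapdump` has been run).
// Write one snapshot before and one after the loop, then load both files
// in Chrome DevTools and use the comparison view to see which DS/Resource
// objects are retained and what is holding on to them.
var heapdump = require('heapdump');

heapdump.writeSnapshot('before.heapsnapshot');
// ... run the reproduction loop from the sketch above ...
heapdump.writeSnapshot('after.heapsnapshot');
```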