
So basically I'm trying to find a memory-efficient way to copy an array. Suppose we have arr1 which contains 500000 elements of value true:

var arr1 = [];

for (var i = 0; i < 500000; i++) {
    arr1[i] = true;
}

In Node.js, that occupies about 32.3 MiB (including 18.7 MiB when Node starts). Now, obviously when you make a reference to arr1, no memory will be allocated:

var arr2 = arr1;
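(Aside: the figures above come from a task manager, which reports the whole process RSS, V8 runtime included. Node can isolate the JS heap itself via `process.memoryUsage()`; a minimal sketch of measuring the array's cost that way:)

```javascript
// Hypothetical sketch: measure the heap from inside Node itself,
// before and after building the array, instead of reading RSS
// from a task manager.
var before = process.memoryUsage().heapUsed;

var arr1 = [];
for (var i = 0; i < 500000; i++) {
    arr1[i] = true;
}

var after = process.memoryUsage().heapUsed;
console.log('array cost: ~' +
    ((after - before) / 1024 / 1024).toFixed(1) + ' MiB');
```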

Now, when I perform a copy of arr1 into arr2:

var arr2 = arr1.concat();

The process now occupies 36.2 MiB, an increase of about 4 MiB.

Here's the thing: no matter what I do to empty or wipe the original array, the memory allocated to that array won't get freed or picked up by the garbage collector. Suppose I have:

arr1.length = 0;
delete arr1;
arr1 = undefined;

arr1 = arr2.concat();

After all of that, the process now occupies 39.8 MiB.

So what is really happening here? Is there some secret reference to the original array that Node (or whatever JS engine out there) is trying to hide from me? Here's further code:

arr2.length = 0;
delete arr2;
arr2 = undefined;

arr2 = arr1.concat();

Which will simply "empty" arr2 so it can hold a copy of arr1. As you may have figured out, I'm attempting to transfer the array's contents back and forth, but now the process occupies 43.5 MiB. If this were a large array, the memory intake would be huge. Is there a way to do this, taking memory efficiency into account?
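(For reference, if the goal is only to hand the contents back and forth between the two variables, swapping the references themselves copies no element data at all; a minimal sketch:)

```javascript
var arr1 = [true, true, true]; // stand-in for the 500000-element array
var arr2 = [];

// Swap which variable points at which array.
// No elements are copied, so no extra memory is allocated.
var tmp = arr1;
arr1 = arr2;
arr2 = tmp;

console.log(arr2.length); // 3
```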

  • Just curious, how are you profiling the memory usage? Commented May 31, 2015 at 23:15
  • Also, how long after you discard the original array are you checking the memory usage? The GC may not collect it right away. Commented May 31, 2015 at 23:16
  • I'm simply using xfce4-taskmanager to check the memory usage. I've waited about 10 minutes with no change in the usage at all. Commented May 31, 2015 at 23:21
  • Have you tried forcing the garbage collection to occur? Maybe this article will help you out with that. Commented May 31, 2015 at 23:23
  • @CathedralZealot Are you testing this in the browser console? If so, which browser? Commented May 31, 2015 at 23:25
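(A minimal sketch of the GC-forcing idea from the comments, assuming Node is started with the `--expose-gc` flag, which is what makes `global.gc` available:)

```javascript
// Run with: node --expose-gc script.js
var arr1 = [];
for (var i = 0; i < 500000; i++) {
    arr1[i] = true;
}

arr1 = null; // drop the only reference to the array

if (global.gc) {
    global.gc(); // request a full collection immediately
    console.log('heapUsed after gc: ' + process.memoryUsage().heapUsed);
} else {
    console.log('run with --expose-gc to make global.gc available');
}
```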

1 Answer


Your profiling technique is not correct.

I created an array the same way you do, made "a clone" with the same .concat() call, and here are the results:

[Chrome heap snapshot: the array (~2.06 MB) retained by two references]

So as you can see, it's the same array (taking just ~2.06 MB) retained by two references.

The corresponding jsfiddle: http://jsfiddle.net/6o0h0r1j/

Relevant reading:

To summarize: your assumptions are wrong from the very beginning.


4 Comments

Can we then assume that my issue does not apply in a browser environment?
@CathedralZealot I highly doubt an array of 500k booleans would take THAT MUCH space even in an ancient version of V8. It would be helpful if you explained how exactly you profiled the heap contents.
@lispHK01 I honestly have no idea - I only profile in google chrome
node-inspector exposes the same profiling tools for node.
