
I have a large (~10,000 items) object of objects read from a JSON file and stored as a local variable, in a format something like this:

{
    "some_uuid_1" : {"code":"some_code_1", "name":"asdf"},
    "some_uuid_2" : {"code":"some_code_2", "name":"qwer"},
    ...
    "some_uuid_n" : {"code":"some_code_n", "name":"zxcv"}
}

I'd like to iterate through the entire main object, compare each element's name property to some variable checkName, and append the element's code property to a DOM element if there is a match, as such:

function myFilter(checkName)
{
    var myArray = Object.values(myObj);

    for (var i = 0; i < myArray.length; i++)
    {
        if (myArray[i]["name"] == checkName)
        {
            $("#someElement").append(`${myArray[i]["code"]} <br />`);
        }
    }
}

However, as the size of the object is quite large, I'd like to run the function asynchronously so that it doesn't freeze up the browser while it's running. I don't mind if the DOM element #someElement is slowly populated in the background while other code runs.

How may I accomplish this using JavaScript and/or jQuery?

  • Why don't you create some variable htmlContent, gather everything in it as a string and access DOM once, instead of 10k times? Commented Jun 9, 2018 at 12:47
  • In my particular use case, there will only be about 3-4 matches expected, so accessing the DOM multiple times isn't really a problem as much as the iteration itself. But I definitely agree that your suggestion has value if there are far more matches expected. Commented Jun 9, 2018 at 12:52
  • See stackoverflow.com/q/6864397/1048572, stackoverflow.com/q/714942/1048572 Commented Jun 9, 2018 at 13:52
  • @theDupeVoteCaster good catch, this is it. Commented Jun 9, 2018 at 14:06

3 Answers


A small helper function might be useful here:

function asyncForEach(arr, cb, done) {
  (function next(i) {
    if(i >= arr.length) {
       if(done) done();
       return;
    }
    cb(arr[i], i, arr);
    setTimeout(next, 0, i + 1); // a small trick to defer actions
  })(0);
}

Or, to optimize it, you could chunk the work and only yield every 1000 iterations or so:

function asyncForEach(arr, cb, done) {
  (function next(i) {
    if (i >= arr.length) {
       if (done) done();
       return;
    }
    let stop = i + 1000;
    setTimeout(next, 0, stop); // schedule the next chunk first, then process this one synchronously
    while (i < arr.length && i < stop)
      cb(arr[i], i++, arr);
  })(0);
}

Which can be used like this in your case:

asyncForEach(myArray, function(el) {
  if (el.name === checkName){
    $("#someElement").append(`${el.code} <br />`);
  }
});

However, the slowest part here is probably appending to the DOM. If you don't want to have "live progress", it's probably good to batch the DOM update into one single call:

let result = "";
asyncForEach(myArray, function(el) {
  if (el.name === checkName){
    result += `${el.code} <br />`;
  }
}, function() {
  $("#someElement").append(result);
});

And then even the synchronous variant might be fast enough:

let result = "";
for(const el of myArray) {
  if(el.name === checkName)
    result += `${el.code} <br />`;
}
$("#someElement").append(result);

2 Comments

Thanks for your reply. However, when I ran the code on a test dataset, I noticed that the results for both asynchronous versions take far longer to appear than the synchronous version (and even the original code that I posted). Why would this be the case, and won't setTimeout() slow down the code by acting similarly to sleep() in other languages?
@user2035362 no, setTimeout does not halt the JavaScript thread, which most other languages do with sleep. And yes, setTimeout slows everything down, which is mandatory so that the thread can take its time to repaint. To optimize, you might want to chunk the array, and execute 1000 results or so synchronously and then setTimeout one tick before executing the next 1000.

Even 10k items shouldn't be too intensive for modern browsers. The true time sink is the append, which causes the browser to reflow/repaint on every iteration of the loop that matches the name.

This can be avoided by combining all of the matches into one string and then appending that string once.

const codes = Object.values(myObj)
  .filter(({name}) => name === checkName)
  .map(({code}) => code) // note: a destructured arrow parameter needs its own parentheses
  .join('<br/>');

$('#someElement').append(codes);

2 Comments

Yes, I noticed that the synchronous code executes virtually instantaneously when testing on my own computer, but I'm concerned that weaker client devices might not perform as well, and that the sub-objects in the actual production dataset will also contain far more values than just a simple name and code. At what size of the dataset does performance start to become an issue on an average computer / mobile device?
According to the OP's previous comment, he is only expecting 3-4 matches. I doubt that append is the culprit.

One possible solution is to execute your code in a web worker.

A worker is an object created using a constructor (e.g. Worker()) that runs a named JavaScript file — this file contains the code that will run in the worker thread; workers run in another global context that is different from the current window. Thus, using the window shortcut to get the current global scope (instead of self) within a Worker will return an error.

So, the work won't block the main thread.

Here is the reference link
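
Here is a rough sketch of that approach. The file name filter-worker.js and the message shape are just assumptions; since workers can't touch the DOM, the worker only does the filtering and posts the matching codes back to the main thread:

// filter-worker.js (hypothetical file name)
self.onmessage = function(e) {
  var myObj = e.data.myObj;
  var checkName = e.data.checkName;
  var matches = [];
  for (var key in myObj) {
    if (myObj[key].name === checkName)
      matches.push(myObj[key].code);
  }
  self.postMessage(matches); // hand the results back to the main thread
};

And on the main thread:

var worker = new Worker('filter-worker.js');
worker.onmessage = function(e) {
  // only the main thread may touch the DOM
  $("#someElement").append(e.data.join(' <br />'));
};
worker.postMessage({ myObj: myObj, checkName: checkName });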

7 Comments

Workers can't access the DOM.
Please limit answers to those that actually answer the question. In this case, there is too much unknown code between "workers" and a solution, so this text would be better off as a comment than an answer.
They can't, but it's possible to do the loop and the comparison in it.
Agreed with @Brissy. Of these 10k elements, what's the average count after the filtering function? 10? 100? 1000? 5000? It might be worth using workers to do the main loop, freeing the UI, and then posting the results to the main thread in order to add them. The main thread can also add them asynchronously, basically using both approaches. I wouldn't discard the workers for the iteration.
Yeah, but this is still far too broad. You could also solve this problem with a donut.
