
I have a .tsv file with some order information. After transforming it in my script, I get this:

[{"order":"5974842dfb458819244adbf7","name":"Сергей Климов","email":"[email protected]"},
{"order":"5974842dfb458819244adbf8","name":"Сушков А.В.","email":"[email protected]"},
{"order":"5974842dfb458819244adbf9","name":"Виталий","email":"[email protected]"},
...
and so on

I have a Mongoose schema:

var ClientSchema = mongoose.Schema({
  name:{
    type: String
  },
  email:{
    type: String,
    unique : true,
    required: true,
    index: true
  },
  forums:{
    type: String
  },
  other:{
    type: String
  },
  status:{
    type: Number,
    default: 3
  },
  subscribed:{
    type: Boolean,
    default: true
  },
  clienturl:{
    type: String  
  },
  orders:{
    type: [String]
  }
});

clienturl is an 8-character password generated by a function.

module.exports.arrayClientSave = function(clientsArray,callback){
  let newClientsArray = clientsArray
    .map(function(x) {
      var randomstring = Math.random().toString(36).slice(-8);
      x.clienturl = randomstring;
      return x;
    });
  console.log(newClientsArray);
  Client.update( ??? , callback );
}

But I don't understand how to write the update. If the email already exists, push to the orders array without overwriting any other fields; if the email doesn't exist, save a new user with clienturl and so on. Thanks!

  • I didn't comment right away because I started learning about bulkWrite and Mongo usage. The answer was exactly what I needed. Commented Jul 27, 2017 at 11:48

1 Answer


Probably the best way to handle this is .bulkWrite(), a MongoDB method for sending "multiple operations" in a "single" request with a "single" response. This removes the need to manage async calls and responses for each "looped" item yourself.

module.exports.arrayClientSave = function(clientsArray, callback) {
  let newClientsArray = clientsArray
    .map(x => {
      var randomstring = Math.random().toString(36).slice(-8);
      x.clienturl = randomstring;
      return x;
    });
  console.log(newClientsArray);

  let ops = newClientsArray.map(x => (
    { "updateOne": {
      "filter": { "email": x.email },
      "update": {
        "$addToSet": { "orders": x.order },    // append order only if not present
        "$setOnInsert": {                      // set only when a new document is created
          "name": x.name,
          "clienturl": x.clienturl             // matches the schema field name
        }
      },
      "upsert": true                           // insert when no document matches the filter
    }}
  ));

  Client.bulkWrite(ops, callback);
};

The main idea is to use the "upsert" functionality of MongoDB to drive the "create or update" behaviour. $addToSet appends the "orders" value to the array only where it is not already present, and $setOnInsert takes effect only when the operation actually results in an "upsert"; it is not applied when the operation matches an existing document.
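To make the interaction of upsert, $addToSet, and $setOnInsert concrete, here is a toy in-memory emulation in plain JavaScript (no database required; collection, applyOp, and makeOp are made up for this illustration and are not MongoDB APIs):

```javascript
// Toy in-memory emulation of updateOne with upsert + $addToSet + $setOnInsert.
const collection = []; // stands in for the clients collection

function applyOp(op) {
  const { filter, update, upsert } = op.updateOne;
  let doc = collection.find(d => d.email === filter.email);
  if (!doc && upsert) {
    // On insert, the filter fields plus the $setOnInsert fields form the new document
    doc = Object.assign({ email: filter.email, orders: [] }, update.$setOnInsert);
    collection.push(doc);
  }
  if (doc) {
    // $addToSet: push only when the value is not already in the array
    const order = update.$addToSet.orders;
    if (!doc.orders.includes(order)) doc.orders.push(order);
  }
}

const makeOp = (email, order, name) => ({
  updateOne: {
    filter: { email },
    update: {
      $addToSet: { orders: order },
      $setOnInsert: { name, clienturl: 'abcd1234' }
    },
    upsert: true
  }
});

applyOp(makeOp('a@b.c', 'order-1', 'First'));  // no match: inserts doc with name "First"
applyOp(makeOp('a@b.c', 'order-2', 'Second')); // match: pushes order-2, name stays "First"
applyOp(makeOp('a@b.c', 'order-1', 'Third'));  // duplicate order id: array unchanged
console.log(collection);
```

After the three calls there is exactly one document, its name is still "First" (the $setOnInsert from later calls was ignored), and its orders array holds order-1 and order-2 once each.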

Also, applying this within .bulkWrite() makes it a "single async call" when talking to a MongoDB server that supports it, which is any version of MongoDB 2.6 or greater.

However, the main point of the .bulkWrite() API is that it "detects" whether the connected server actually supports "Bulk" operations. When it does not, the call "downgrades" to individual "async" calls instead of one batch. But this is handled by the "driver", and your code still sees a single request and response.

This means all the difficulty of the "async loop" is handled in the driver software itself: either avoided entirely by the supported bulk method, or "emulated" in a way that keeps your code simple.


2 Comments

Wow! That's exactly what I needed! Thanks!
Sorry, I'm a bit of a newbie =)
