
At the moment I'm trying to build a data import in Angular. Everything works fine, but my backend only allows around 70 requests, so the import runs into an error. My idea is to put a timeout in the Angular for loop that issues the POST requests. Is this the right way to handle the problem? And if yes, how do I do it? I can't find any solution. This is my request at the moment:

for (const x of this.data) {
  this.http.post(url, x, { headers: reqHeader })
    .toPromise()
    .then((data: any) => { console.warn(this.data); });
}
  • You basically need retryWhen: stackoverflow.com/a/44979389/2490286 Commented Sep 16, 2020 at 7:19
  • Do you mean a total of 70 requests or 70 parallel requests at a time? Commented Sep 16, 2020 at 7:27

2 Answers


Maybe rate limiting is what you are looking for?

... Then we'll chain it with bufferTime and concatMap. The concatMap operator is where we force the 1000ms delay:

import { range, of } from 'rxjs';
import { bufferTime, concatMap, delay, map, timestamp } from 'rxjs/operators';

const startTime = new Date().getTime();

// In RxJS 6+ the static Observable.range is imported as range from 'rxjs'.
const source = range(1, 25).pipe(
  concatMap(val => of(val).pipe(delay(75)))
);

source.pipe(
  bufferTime(1000, null, 5),
  concatMap(buffer => of(buffer).pipe(delay(1000))),
  timestamp(),
  map(obj => ({ value: obj.value, timestamp: obj.timestamp - startTime }))
).subscribe(obj => console.log(obj));
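If wiring up the RxJS operators feels awkward, the same batching idea can be sketched with plain Promises (a minimal illustration, not from the linked answer; `runInBatches` and its parameters are made-up names):

```typescript
// Sketch: run async tasks in batches of `batchSize`, pausing `pauseMs`
// milliseconds between batches to stay under a request limit.
async function runInBatches<T>(
  tasks: Array<() => Promise<T>>,
  batchSize: number,
  pauseMs: number
): Promise<T[]> {
  const results: T[] = [];
  for (let i = 0; i < tasks.length; i += batchSize) {
    const batch = tasks.slice(i, i + batchSize);
    // Start the whole batch in parallel, wait for all of it to finish.
    results.push(...(await Promise.all(batch.map(task => task()))));
    // Pause before the next batch (skip the pause after the last one).
    if (i + batchSize < tasks.length) {
      await new Promise(resolve => setTimeout(resolve, pauseMs));
    }
  }
  return results;
}
```

With Angular's HttpClient you would build the task list as `() => this.http.post(...).toPromise()` thunks so the requests don't all fire at once.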

1 Comment

Thanks a lot, this seems to be what I need, but I can't figure out how to use it with my code. Do I have to put it into the for loop? And how do I import Observable.range? It tells me it doesn't exist. Thanks a lot.

Not sure what exactly you mean by "backend only allows around 70 requests". But if you wish to control the number of parallel requests at a time, you could use the RxJS from and forkJoin functions with the bufferCount and concatMap operators.

  1. Try to avoid toPromise(). It is an easy way to switch back to the familiar Promise paradigm, but it is deprecated as of RxJS 7 and will be removed in RxJS 8. Instead, subscribe to the observables.

  2. For controlled parallel requests, try the following:

import { from, forkJoin } from 'rxjs';
import { bufferCount, concatMap } from 'rxjs/operators';

const reqs = this.data.map(item => this.http.post(url, item, { headers: reqHeader }));

from(reqs).pipe(
  bufferCount(6),      // <-- adjust number of parallel requests here
  concatMap(buffer => forkJoin(buffer))
).subscribe(
  res => console.warn(res),
  err => console.log(err),
  () => console.log('complete')
);
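One caveat with forkJoin: if any single request in a batch errors, the whole forkJoin errors and the stream dies. With RxJS you would add a per-request catchError; with plain Promises the equivalent tool is Promise.allSettled. A small sketch (`settleBatch` is a hypothetical name, not part of the answer's code):

```typescript
// Sketch: run a batch of async tasks but keep individual failures from
// killing the whole batch; failed tasks come back as null.
async function settleBatch<T>(
  tasks: Array<() => Promise<T>>
): Promise<Array<T | null>> {
  const settled = await Promise.allSettled(tasks.map(task => task()));
  return settled.map(s => (s.status === "fulfilled" ? s.value : null));
}
```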

Update: delay each request

You could forgo the bufferCount and instead issue each request individually with an explicit delay. Try the following:

from(this.data).pipe(
  concatMap(item => this.http.post(url, item, { headers: (reqHeader) }).pipe(
    delay(3000)     // <-- wait 3 seconds b/n each request
  ))
).subscribe(
  res => { },
  err => { }
);
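For reference, the same sequential-with-delay pattern can be expressed with async/await instead of concatMap and delay (a sketch under the same assumptions; `runSequentially` is an illustrative name):

```typescript
// Sketch: run async tasks one after another, waiting `delayMs`
// milliseconds after each one before starting the next.
async function runSequentially<T>(
  tasks: Array<() => Promise<T>>,
  delayMs: number
): Promise<T[]> {
  const results: T[] = [];
  for (const task of tasks) {
    results.push(await task());
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
  return results;
}
```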

Update: count the emissions

You could introduce a variable (e.g. count) and use the map operator to return both the count and the response.

Try the following

someFunc() {
  let count = 0;

  from(this.data).pipe(
    concatMap(item => this.http.post(url, item, { headers: (reqHeader) }).pipe(
      map(res => {
        count++;
        return {
          count: count,
          response: res
        }
      }),
      delay(3000)     // <-- wait 3 seconds b/n each request
    ))
  ).subscribe(
    res => {
      console.log(res.count);        // <-- the count of the emission
      console.log(res.response);     // <-- the response from `this.http.post()`
    },
    err => { }
  );
}
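Update: retry on 429

If the server still occasionally answers with 429 Too Many Requests despite the delay, you could retry the failed request after a pause (the Retry-After response header, if present, tells you how long to wait). A minimal Promise-based sketch; `retryOn429` and its parameters are illustrative names, not an Angular or RxJS API:

```typescript
// Sketch: retry an async call when it rejects with a 429-style error,
// waiting `delayMs` milliseconds between attempts, up to `maxRetries` times.
async function retryOn429<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  delayMs = 1000
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      // Re-throw anything that isn't a 429, or when retries are exhausted.
      if (err?.status !== 429 || attempt >= maxRetries) throw err;
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
}
```

In pure RxJS the same idea is usually expressed with the retryWhen (or retry with a delay config) operator, as the comment under the question suggests.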

10 Comments

I tried bufferCount(1) and it seems to run nicely, but after around 60 requests: headers: {…}, status: 429, statusText: "Too Many Requests"
Does your response have a Retry-After header? How long is the wait time?
The wait time is a few seconds. Is there no way to put a delay inside?
@SaschaK: I've updated the answer to have an explicit wait time b/n each request using the delay operator. Please check if it works for you.
You are the hero. Thank you so much. With from the code looks really nice and runs perfectly. And, not to forget, I learned something more in Angular.
