
My PHP application decodes about 100 JSON objects per request, and I handle many requests per second, so I need to optimize my code. I was wondering whether it would be faster to decode one huge JSON object instead of 100 smaller ones (assuming the big JSON was made by nesting the small ones). Has anyone had a similar issue?
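
A minimal sketch of the two variants being compared, assuming the records arrive as an array of JSON strings (the data and variable names are illustrative):

    <?php
    // Assumption: $jsonStrings holds ~100 individual JSON documents.
    $jsonStrings = array(
        '{"id":1,"name":"foo"}',
        '{"id":2,"name":"bar"}',
        // ... ~98 more
    );

    // Variant A: decode each small document separately.
    $records = array();
    foreach ($jsonStrings as $json) {
        $records[] = json_decode($json, true);
    }

    // Variant B: nest the same documents into one big JSON array
    // and decode it with a single call.
    $bigJson = '[' . implode(',', $jsonStrings) . ']';
    $records = json_decode($bigJson, true);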

  • $data = json_decode($data, true); Commented Oct 18, 2012 at 13:18
  • Those ~100 JSON literals are sent with the request and there's no way around that? Commented Oct 18, 2012 at 13:40
  • With each request I need to fetch about 100 JSON-encoded records from Redis (a fetch-and-decode sketch follows these comments). Commented Oct 18, 2012 at 13:51
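
A minimal sketch of that fetch-and-decode step, assuming the phpredis extension and hypothetical "record:<id>" key names:

    <?php
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    // Assumption: each value stored under "record:<id>" is a JSON string.
    $keys = array();
    for ($i = 1; $i <= 100; $i++) {
        $keys[] = "record:$i";
    }

    $values  = $redis->mGet($keys);   // one round trip for all 100 records
    $records = array();
    foreach ($values as $json) {
        if ($json !== false) {        // mGet() returns false for missing keys
            $records[] = json_decode($json, true);
        }
    }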

1 Answer


You will get nearly the same performance with both methods.

Check your request execution time and response time with Firebug or a similar tool.

After your test, it seems that your big JSON is actually slower than decoding the multiple mini JSON documents.
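
For a server-side comparison (Firebug mostly shows the client-side picture), a rough microtime()-based benchmark along these lines could be used; the sample data is illustrative:

    <?php
    // Build ~100 small JSON documents and one big document that nests them.
    $small = array();
    for ($i = 0; $i < 100; $i++) {
        $small[] = json_encode(array('id' => $i, 'payload' => str_repeat('x', 200)));
    }
    $big = '[' . implode(',', $small) . ']';

    // Time 1000 iterations of each decode strategy.
    $start = microtime(true);
    for ($n = 0; $n < 1000; $n++) {
        foreach ($small as $json) {
            json_decode($json, true);
        }
    }
    $smallTime = microtime(true) - $start;

    $start = microtime(true);
    for ($n = 0; $n < 1000; $n++) {
        json_decode($big, true);
    }
    $bigTime = microtime(true) - $start;

    printf("100 small decodes: %.4fs, one big decode: %.4fs\n", $smallTime, $bigTime);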


2 Comments

Quite interesting. I don't have your source code; maybe your merge is a huge operation.
I don't measure the merge operation; I start counting the time after the merge is done.
