
I have a table cart:

 id     | value |     metadata
--------+-------+-------------------
  45417 |     0 | {"value": "1300"}
  45418 |     0 | {"value": "1300"}
 276021 |     0 | {"value": "1300"}

and I'm trying to UPDATE the value column with the value stored in the JSONB metadata, where it exists. I came up with the following query:

UPDATE cart
SET value = CAST(subquery.meta_val AS INTEGER)
FROM (
    SELECT id, metadata->>'value' AS meta_val
    FROM cart
    WHERE value = 0
      AND metadata->>'value' IS NOT NULL
) AS subquery
WHERE cart.id = subquery.id;

This works, but it takes quite a lot of time for the 4M rows I want to update in production, and the query looks redundant to me: it scans cart in the subquery and then joins back to the same table.

I think the next step would be to wrap all this in a transaction and improve the query. Is there anything that can be done to improve this query's performance?

1 Answer
Try it without a subquery.

UPDATE cart AS c
SET value = coalesce((c.metadata->>'value')::int, 0);
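If many rows already hold the right value or have no metadata value at all, it may also help to skip them entirely, so PostgreSQL does not lock and rewrite row versions it doesn't need to. A sketch, keeping the question's original value = 0 filter:

```sql
-- Only touch rows still at the placeholder value that
-- actually carry a value in the metadata.
UPDATE cart AS c
SET value = (c.metadata->>'value')::int
WHERE c.value = 0
  AND c.metadata->>'value' IS NOT NULL;
```

Rows excluded by the WHERE clause are never rewritten, which is usually the biggest win for a large one-off UPDATE.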

1 Comment

Or, if you want to leave the value unperturbed, use c.value instead of 0.
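That variant might look like this: coalesce falls back to the row's existing value whenever the metadata key is absent, instead of overwriting it with 0.

```sql
-- Keep the current value when metadata has no 'value' key.
UPDATE cart AS c
SET value = coalesce((c.metadata->>'value')::int, c.value);
```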
