
I have multiple JSON files; they all have the same format, but the values differ per transaction. I want to migrate this data into a PostgreSQL table. What is the best way to proceed with this?

Right now, I am using the following query:

CREATE TABLE TEST (MULTIPROCESS VARCHAR(20), HTTP_REFERER VARCHAR(50));
INSERT INTO TEST SELECT MULTIPROCESS, HTTP_REFERER FROM json_populate_record(NULL::test, '{"multiprocess": true,"http_referer": "http://localhost:9000/"}');

But once the number of files becomes large, this technique becomes very difficult to use. Is there a more effective way to do this?

1 Answer


You could use a LATERAL join to insert more than one row at a time:

WITH json AS (
  VALUES ('{"multiprocess": true,"http_referer":"http://localhost:9000"}')
       , ('{"multiprocess": false,"http_referer": "http://localhost:9001/"}')
       , ('{"multiprocess": true,"http_referer": "http://localhost:9002/"}')
)
INSERT INTO test
SELECT multiprocess, http_referer
FROM   json, LATERAL json_populate_record(NULL::test, json.column1::json);

Or you could insert into a staging table first and then populate your other table.
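Once there are many files, you can also script the batching: read each file, validate it as JSON, and generate one multi-row statement in the same shape as the query above. A minimal Python sketch, where the `build_batch_insert` helper, the `transactions/*.json` glob pattern, and the `test` table name are assumptions for illustration:

```python
import glob
import json


def build_batch_insert(json_docs, table="test"):
    """Build one multi-row INSERT in the shape of the LATERAL query above.

    Each document is validated with json.loads, then embedded as a quoted
    SQL string literal (single quotes doubled to escape them).
    """
    rows = []
    for doc in json_docs:
        json.loads(doc)  # fail fast on malformed input
        rows.append("('%s')" % doc.replace("'", "''"))
    return (
        "WITH json AS (VALUES %s) "
        "INSERT INTO %s SELECT multiprocess, http_referer "
        "FROM json, LATERAL json_populate_record(NULL::%s, json.column1::json);"
        % (", ".join(rows), table, table)
    )


# Collect every file and emit a single statement (directory is hypothetical).
docs = []
for path in sorted(glob.glob("transactions/*.json")):
    with open(path) as f:
        docs.append(f.read())

if docs:
    print(build_batch_insert(docs))
```

In production you would likely feed the documents through a driver with proper parameter binding (for example psycopg2) instead of interpolating them into the SQL text, but the generated statement above shows the batching idea.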

