
I'm attempting to load TSV data from a file into a Postgres table using the \COPY command.

Here's an example data row:

2017-11-22 23:00:00     "{\"id\":123,\"class\":101,\"level\":3}"

Here's the psql command I'm using:

\COPY bogus.test_table (timestamp, sample_json) FROM '/local/file.txt' DELIMITER E'\t'

Here's the error I'm receiving:

ERROR:  invalid input syntax for type json
DETAIL:  Token "sample_json" is invalid.
CONTEXT:  JSON data, line 1: "{"sample_json...
COPY test_table, line 1, column sample_json: ""{\"id\":123,\"class\":101,\"level\":3}""

I verified the JSON itself is valid and read a couple of similar questions, but I'm still not sure what's going on here. An explanation would be awesome.

4 Answers


To load your data file as it is:

\COPY bogus.test_table (timestamp, sample_json) FROM '/local/file.txt' CSV DELIMITER E'\t' QUOTE '"' ESCAPE '\'

4 Comments

Works perfectly in psql, and I adapted it to work in a Python script. Thanks!
Did the trick for me too, even for very complicated, deeply nested JSON fields.
@lemonmaster could you share a gist of the python script?
@moshevi unfortunately I can't do that since it's related to work (the script has since been changed). If you're using psycopg2 you can use cursor.copy_from(), which is what I'd recommend.
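Since the comment above mentions psycopg2, here is a minimal sketch of the same fix from Python. Everything below is an assumption for illustration: the connection details, file path, and table/column names are placeholders, and `copy_expert()` is used (rather than `copy_from()`) because it accepts a full COPY statement, so the CSV/QUOTE/ESCAPE options from the answer can be passed through unchanged:

```python
# Sketch: load the TSV from Python via psycopg2's copy_expert(), which
# takes the same CSV / QUOTE / ESCAPE options as psql's \COPY.
# Connection details, file path, and table name are placeholders.
try:
    import psycopg2  # only needed to actually run the load
except ImportError:
    psycopg2 = None

def build_copy_sql(table: str, columns: list[str]) -> str:
    """Build the COPY ... FROM STDIN statement with the CSV options."""
    cols = ", ".join(columns)
    return (
        f"COPY {table} ({cols}) FROM STDIN "
        "CSV DELIMITER E'\\t' QUOTE '\"' ESCAPE '\\'"
    )

def load_tsv(conn, path: str) -> None:
    """Stream the file into the table using the statement above."""
    sql = build_copy_sql("bogus.test_table", ["timestamp", "sample_json"])
    with conn.cursor() as cur, open(path) as f:
        cur.copy_expert(sql, f)
    conn.commit()

# Usage (assumes a reachable database):
# conn = psycopg2.connect("dbname=mydb user=me")
# load_tsv(conn, "/local/file.txt")
```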

Aeblisto's answer almost did the trick for my crazy JSON fields, but I needed to change one small bit: the QUOTE character, set to E'\b'. Here it is in full form:

COPY your_schema_name.your_table_name (your, column_names, here)
FROM STDIN 
WITH CSV DELIMITER E'\t' QUOTE E'\b' ESCAPE '\';
--here rows data
\.

1 Comment

I have a different but related problem: I want to export JSONB data as JSON text. I tried the above command but with \copy in the output direction, like \copy (select my_column from my_table where id='some_id') TO 'my_output.txt' WITH CSV DELIMITER E'\t' QUOTE '\b' ESCAPE '\'; which produced the error "quote must be a single one-byte character". Changing QUOTE '\b' to QUOTE E'\b' fixed this, and I get valid JSON output. I would imagine the same change works for the input if anyone has the same problem.

Your JSON is quoted. It shouldn't have surrounding " characters, and the " characters around the field names shouldn't be escaped.

It should look like this:

2017-11-22 23:00:00 {"id":123,"class":101,"level":3}
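If editing the file by hand isn't practical, the cleanup can be scripted. The following is a minimal sketch under assumptions taken from the sample row in the question (tab-separated lines, JSON in the second column, `\"` as the only escape); the function names are made up for illustration:

```python
# Sketch: strip the surrounding quotes and un-escape the \" sequences
# in the JSON column so each line matches COPY's default text format.
# Assumes tab-separated lines with the JSON in the second column.

def unquote_json_field(field: str) -> str:
    """Drop surrounding double quotes and un-escape embedded \" pairs."""
    if field.startswith('"') and field.endswith('"'):
        field = field[1:-1]
    return field.replace('\\"', '"')

def fix_line(line: str) -> str:
    """Rewrite one TSV line into the unquoted form shown above."""
    ts, raw_json = line.rstrip("\n").split("\t", 1)
    return f"{ts}\t{unquote_json_field(raw_json)}"

sample = '2017-11-22 23:00:00\t"{\\"id\\":123,\\"class\\":101,\\"level\\":3}"'
print(fix_line(sample))
# JSON column becomes: {"id":123,"class":101,"level":3}
```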

Comments


Here's a valid example that resolved the "Token .. is invalid" error in my case:

CSV:

1752100878124;{"a":5};{"b":6};821342710
1752100878125;{"a":7};{"b":8};821342710

The table definition and COPY command:

CREATE TABLE staging (
    timestamp BIGINT,
    f1 JSONB,
    f2 JSONB,
    checksum BIGINT
);


COPY staging (timestamp, f1, f2, checksum)
FROM '.../test.csv'
WITH (FORMAT csv, DELIMITER ';', HEADER false, QUOTE '^', ESCAPE '\');

Note the QUOTE '^', which replaces the default quote character `"` with one that never appears in the data, so the JSON fields are read verbatim.
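The idea of picking an unused quote character can be illustrated with Python's csv module as a rough analogy (the sample row is taken from this answer; the exact quoting behavior of Postgres COPY differs in detail):

```python
# Sketch: parse the semicolon-delimited row with '^' as the quote
# character. Since '^' never occurs in the data, quoting is effectively
# disabled and every field, including the JSON, comes through verbatim.
import csv
import io

row = '1752100878124;{"a":5};{"b":6};821342710\n'

fields = next(csv.reader(io.StringIO(row), delimiter=";", quotechar="^"))
print(fields)
# ['1752100878124', '{"a":5}', '{"b":6}', '821342710']
```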

Comments
