My table (table) has a JSONB field (data) that contains a field with an array where I store tags (tags). I query that table with an expression like:

SELECT * FROM table WHERE data->'tags' @? '$[*] ? (@ like_regex ".*(foo|bar).*" flag "i")';

With such use-case is there a way for me to index the data->'tags' array to speed up the query? Or should I rather work on moving the tags array out of the JSONB field and into a TEXT[] field and index that?

I've already tried:

CREATE INDEX foo ON tbl USING GIN ((data->'tags') jsonb_path_ops);

but it doesn't work: https://gist.github.com/vkaracic/a62ac917d34eb6e975c4daeefbd316e8

Comment:

testing an index on a table with only one row is completely meaningless. A Seq Scan will always be the best choice for such a tiny table. You need to populate it with a lot more data ("hundreds of thousands") to test the efficiency of the index. – Commented Apr 28, 2021 at 20:42
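To act on the comment above, you need enough rows for the planner to consider an index at all. A minimal sketch for generating test data (assuming a table named `tbl` with a JSONB column `data`, as in the question's index statement; the tag values are made up):

    -- Generate 500k rows, each with a 'tags' array in the JSONB column.
    INSERT INTO tbl (data)
    SELECT jsonb_build_object(
             'tags',
             jsonb_build_array('tag' || (i % 1000), 'foo' || i))
    FROM generate_series(1, 500000) AS i;

    -- Refresh statistics so the planner can estimate selectivity.
    ANALYZE tbl;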

1 Answer

The index you built can be used (if you set enable_seqscan = off, you will see that it does get used), but it is generally not chosen because it is pretty useless for this query. The only rows it could rule out through the index are the ones that don't have the 'tags' key at all, and even that is poorly estimated, so the index probably won't be used without drastic measures.
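To see this for yourself, you can force the planner away from a sequential scan for the session and inspect the plan (query copied from the question; the session setting is only for diagnosis, not production use):

    SET enable_seqscan = off;

    EXPLAIN
    SELECT * FROM tbl
    WHERE data->'tags' @? '$[*] ? (@ like_regex ".*(foo|bar).*" flag "i")';

    RESET enable_seqscan;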

You could try converting to text[] and then using parray_gin, but it would probably be better to convert to a child table with a text column and then use pg_trgm.
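A sketch of the child-table + pg_trgm approach (table and column names here are illustrative, not from the original post; it assumes the parent table `tbl` has a primary key `id`):

    CREATE EXTENSION IF NOT EXISTS pg_trgm;

    -- One row per (parent row, tag).
    CREATE TABLE tbl_tags (
        tbl_id bigint NOT NULL REFERENCES tbl (id),
        tag    text   NOT NULL
    );

    -- Populate from the existing JSONB array.
    INSERT INTO tbl_tags (tbl_id, tag)
    SELECT id, jsonb_array_elements_text(data->'tags')
    FROM tbl;

    -- Trigram GIN index; it can support regex matching.
    CREATE INDEX tbl_tags_tag_trgm ON tbl_tags USING GIN (tag gin_trgm_ops);

    -- ~* is a case-insensitive regex match, equivalent to the
    -- like_regex ... flag "i" filter in the original query.
    SELECT t.*
    FROM tbl t
    WHERE EXISTS (SELECT 1 FROM tbl_tags tt
                  WHERE tt.tbl_id = t.id
                    AND tt.tag ~* 'foo|bar');

The point of the normalization is that each tag becomes a plain text value, which pg_trgm can index directly for pattern and regex searches, something a jsonb_path_ops index cannot do for like_regex predicates.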
