
We have a Kafka topic with CloudEvents in JSON Schema format that we connected to TimescaleDB/PostgreSQL with the JDBC sink connector.
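
The sink side is a standard Confluent JDBC sink connector. A minimal sketch of our configuration follows; the host, database, and credentials are placeholders, and we create the target table ourselves with a timestamp with time zone column (only the table name, test3_new, comes from the actual setup):

    {
      "name": "timescale-sink",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "topics": "test3_new",
        "connection.url": "jdbc:postgresql://timescale:5432/events",
        "connection.user": "postgres",
        "connection.password": "********",
        "insert.mode": "insert",
        "auto.create": "false",
        "value.converter": "io.confluent.connect.json.JsonSchemaConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081"
      }
    }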

To convert a table to a hypertable we need a time column, but the date-time format in the events generated by our Go app contains nanoseconds (e.g. 2024-06-23T19:36:00.104215111Z). That value is not compatible with PostgreSQL's timestamp with time zone type, so Kafka Connect returns an error like this:

    Batch entry 0 INSERT INTO "test3_new" ("allowed","specversion","id","source","time","type","relation","username","object")
    VALUES (('TRUE'),('1.0'),('1718828631'),('auth/v1/check'),('2024-06-19T20:23:51.18607753Z'),('auth.check'),('reader'),('user:user1592'),('cluster:example-cluster-us-east'))
    was aborted: ERROR: column "time" is of type timestamp with time zone but expression is of type character varying
    Hint: You will need to rewrite or cast the expression.
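
For reference, the event the connector is consuming looks roughly like this, reconstructed from the values in the failing insert above (all fields, including time, arrive as JSON strings):

    {
      "allowed": "TRUE",
      "specversion": "1.0",
      "id": "1718828631",
      "source": "auth/v1/check",
      "time": "2024-06-19T20:23:51.18607753Z",
      "type": "auth.check",
      "relation": "reader",
      "username": "user:user1592",
      "object": "cluster:example-cluster-us-east"
    }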

I couldn't find a way to convert the data to the proper format, either with Kafka Connect transforms or with some built-in Timescale/Postgres function that could do the conversion automatically.
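
For example, Kafka Connect's built-in TimestampConverter SMT looked like the obvious candidate, but its format option is a java.text.SimpleDateFormat pattern, which has no token for nanoseconds, so a config along these lines (field and transform names are ours) either fails to parse the value or misreads the 9-digit fraction as milliseconds:

    "transforms": "convertTime",
    "transforms.convertTime.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.convertTime.field": "time",
    "transforms.convertTime.target.type": "Timestamp",
    "transforms.convertTime.format": "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"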

1 Answer

In the meantime I found this issue: https://github.com/confluentinc/kafka-connect-jdbc/issues/921, and it looks like adding ?stringtype=unspecified at the end of the connection string works fine.
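
With stringtype=unspecified the PostgreSQL JDBC driver sends string parameters as untyped values, so the server itself casts the ISO 8601 string to timestamp with time zone, rounding the nanoseconds to PostgreSQL's microsecond precision. Using the same placeholder URL as above:

    "connection.url": "jdbc:postgresql://timescale:5432/events?stringtype=unspecified"

Once the rows insert cleanly, the table can be converted to a hypertable as usual (table and column names as in the error message):

    SELECT create_hypertable('test3_new', 'time');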
