
I am trying to import and export CSV data with PostgreSQL, where the database is on a remote host. Normally I would use the psql command to do \copy <table> from <local path> ... and \copy <table> to <local path> ..., but I need to do it via Go on systems where I don't have shell access or where psql isn't installed.

The data itself is expected to be pretty light (maybe < 2 MB in total), so I am trying not to implement any structs/schemas to track the columns in the tables. When importing into the DB, I want the library/code to infer the schema of the table and push the data to the tables.

Any suggestions on how to implement this? I am not sure whether database/sql, pgx, or pq allow this without specifying columns. Any advice?

Edit:

I ended up using https://github.com/joho/sqltocsv for DB export, which is simple enough that I don't have to define any schema/structs.
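For reference, here's a minimal sketch of that export path, assuming the lib/pq driver and a hypothetical table1 (the connection string and names are placeholders):

package main

import (
    "database/sql"

    "github.com/joho/sqltocsv"
    _ "github.com/lib/pq"
)

func main() {
    db, err := sql.Open("postgres", "postgres://my_user:my_password@remote_host:5432/db_name")
    if err != nil {
        panic(err)
    }
    defer db.Close()

    // sqltocsv derives the CSV header from the result set's columns,
    // so no struct or schema definition is required.
    rows, err := db.Query("SELECT * FROM table1")
    if err != nil {
        panic(err)
    }
    defer rows.Close()

    if err := sqltocsv.WriteFile("table1.csv", rows); err != nil {
        panic(err)
    }
}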

I don't have the code anymore, but I tried gorm and realized I would need to define a struct/schema for it.
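For context, this is roughly what gorm expects, sketched from memory with a hypothetical Table1 model; it's exactly the per-table schema definition I was trying to avoid:

// Hypothetical gorm model: gorm needs a struct like this defined
// up front for every table before it can read or write rows.
type Table1 struct {
    ID   uint
    Name string
}

// Reading requires the struct type so gorm knows the columns;
// db here is a *gorm.DB.
var records []Table1
db.Find(&records)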

  • Can you share with us what you have tried so far and what problems you ran into? Also, you say you're trying to implement structs but that you're unable to specify columns, which implies you don't know the structure; can you clarify that? Do you know the structure of the data or not? Commented Aug 9, 2021 at 6:44
  • Have you looked into the gorm library? Commented Aug 9, 2021 at 7:09
  • Does this answer your question? Bulk insert from csv in postgres using golang without using for loop Commented Aug 9, 2021 at 8:46
  • @mkopriva, that's correct, I won't know the structure. I just know I'll need to export a bunch of data and re-import it (to the same db/tables). Obviously I could use a DB dump, but the data in there is tougher to parse. So far whatever I have ends up using structs and a schema that's defined for my tables. Commented Aug 10, 2021 at 23:12
  • @Gauranga, I have, and unless I missed something I need to have some schema defined in my code. Ideally I would like to be able to do something as simple as the psql commands. Commented Aug 10, 2021 at 23:13

2 Answers


I found a way to do it with the pgx package (thanks to @Gustavo Kawamoto's suggestion). Here's my import and export:

package main

import (
    "fmt"
    "os"

    "github.com/jackc/pgx"
)

func main() {
    pgxConConfig := pgx.ConnConfig{
        Port:     5432,
        Host:     "remote_host",
        Database: "db_name",
        User:     "my_user",
        Password: "my_password",
    }

    conn, err := pgx.Connect(pgxConConfig)
    if err != nil {
        panic(err)
    }
    defer conn.Close()

    tables := []string{"table1", "table2", "table3"}

    import_dir := "/dir_to_import_from"
    export_dir := "/dir_to_export_to"
    
    for _, t := range tables {
        f, err := os.Open(fmt.Sprintf("%s/table_%s.csv", import_dir, t))
        if err != nil {
            fmt.Println("error opening import file:", err)
            return
        }

        err = importer(conn, f, t)
        f.Close() // close only after COPY has consumed the file
        if err != nil {
            fmt.Println("import error:", err)
            break
        }

        fmt.Println("  Done with import and doing export")
        ef, err := os.OpenFile(fmt.Sprintf("%s/table_%s.csv", export_dir, t), os.O_CREATE|os.O_WRONLY|os.O_TRUNC, 0777)
        if err != nil {
            fmt.Println("error opening export file:", err)
            return
        }

        err = exporter(conn, ef, t)
        ef.Close() // close only after COPY has finished writing
        if err != nil {
            fmt.Println("export error:", err)
            break
        }
    }
}

// importer streams a local CSV file into the given table with COPY FROM STDIN,
// so no column list or struct definition is needed.
func importer(conn *pgx.Conn, f *os.File, table string) error {
    res, err := conn.CopyFromReader(f, fmt.Sprintf("COPY %s FROM STDIN DELIMITER '|' CSV HEADER", table))
    if err != nil {
        return err
    }
    fmt.Println("==> import rows affected:", res.RowsAffected())

    return nil
}

// exporter streams the given table out to a local CSV file with COPY TO STDOUT.
func exporter(conn *pgx.Conn, f *os.File, table string) error {
    res, err := conn.CopyToWriter(f, fmt.Sprintf("COPY %s TO STDOUT DELIMITER '|' CSV HEADER", table))
    if err != nil {
        return fmt.Errorf("error exporting file: %+v", err)
    }
    fmt.Println("==> export rows affected:", res.RowsAffected())
    return nil
}
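Note: CopyFromReader and CopyToWriter are pgx v3 APIs. If you're on pgx v4 or later, the raw COPY passthrough lives on the underlying pgconn connection instead; a minimal sketch of the import side, assuming a v4-style *pgx.Conn and a context ctx:

// pgx v4+ sketch: the raw COPY passthrough moved to the underlying *pgconn.PgConn.
tag, err := conn.PgConn().CopyFrom(ctx, f, fmt.Sprintf("COPY %s FROM STDIN DELIMITER '|' CSV HEADER", table))
if err != nil {
    return err
}
fmt.Println("==> import rows affected:", tag.RowsAffected())

The export side is symmetric via conn.PgConn().CopyTo(ctx, w, sql).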

1 Comment

This was really helpful, thanks! For anyone looking to specifically export CSV to a string with go-pg, you can use db.CopyTo(buffer, query) with a query like COPY (select 'foo' as "a", 'bar' as "b") TO STDOUT WITH CSV HEADER DELIMITER ','; and your buffer will contain the CSV export result, ready to be used elsewhere.
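A minimal sketch of that go-pg call, assuming db is a *pg.DB obtained from pg.Connect:

var buf bytes.Buffer
// CopyTo streams the COPY ... TO STDOUT output straight into the buffer.
_, err := db.CopyTo(&buf, `COPY (select 'foo' as "a", 'bar' as "b") TO STDOUT WITH CSV HEADER DELIMITER ','`)
if err != nil {
    panic(err)
}
fmt.Println(buf.String()) // the CSV result, ready to be used elsewhere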

Check out this pkg: https://github.com/chop-dbhi/sql-importer

  1. Automatic table creation
  2. Uniqueness and not null detection.
  3. Support for CSV files wider than 1600 columns (the Postgres limit)

1 Comment

I'm trying to use this, but it requires providing a table schema. Is that correct? I wish there were an example schema to specify. I can specify the schema I used to create the table, but I have a requirement to export/import for a schema that's not rigid.
