I am trying to import and export CSV data with PostgreSQL, where the database is on a remote host. Normally I would use the psql command to do \copy <table> from <local path> ... and \copy <table> to <local path> ..., but I need to be able to do it via Go, on systems where I have no shell access or where psql isn't installed.
The data itself is expected to be pretty light (maybe < 2 MB in total), so I am trying to avoid implementing any structs/schemas to track the columns in the tables. When importing into the DB, I want the library/code to infer the schema of the table and push the data into it.
Any suggestions on how to implement this? I am not sure whether Go's database/sql, pgx, or pq allow this without specifying columns. Any advice?
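To illustrate what I'm after, here is a sketch of the import side that I think should work: it drives PostgreSQL's COPY ... FROM STDIN (the server-side twin of psql's \copy) through pgx's low-level connection, streaming the CSV file as-is so the server parses it and no columns are declared in Go. The people table, people.csv file, and DATABASE_URL environment variable are all placeholders; note the CSV's column order has to match the table's.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()

	// Connect using a DSN from the environment (placeholder).
	conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer conn.Close(ctx)

	// Open the local CSV file to stream to the server.
	f, err := os.Open("people.csv") // hypothetical file
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// COPY ... FROM STDIN lets the server parse the CSV, so no
	// structs or column lists are needed on the Go side. HEADER
	// makes it skip the first line of the file.
	tag, err := conn.PgConn().CopyFrom(ctx, f,
		"COPY people FROM STDIN WITH (FORMAT csv, HEADER)")
	if err != nil {
		panic(err)
	}
	fmt.Println("rows imported:", tag.RowsAffected())
}
```

If that works the way I expect, export should be the mirror image: PgConn's CopyTo with COPY ... TO STDOUT WITH (FORMAT csv, HEADER), writing to an io.Writer.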
Edit:
I ended up using https://github.com/joho/sqltocsv for DB export, which is simple enough that I didn't have to define any schema/structs.
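For anyone else landing here, the export side ended up looking roughly like the sketch below: sqltocsv takes *sql.Rows directly and derives the CSV headers from the query's columns, so no structs are involved. The people table and DATABASE_URL environment variable are placeholders.

```go
package main

import (
	"database/sql"
	"os"

	"github.com/joho/sqltocsv"
	_ "github.com/lib/pq" // registers the "postgres" driver
)

func main() {
	// Connect using a DSN from the environment (placeholder).
	db, err := sql.Open("postgres", os.Getenv("DATABASE_URL"))
	if err != nil {
		panic(err)
	}
	defer db.Close()

	// "people" is a hypothetical table; sqltocsv reads the column
	// names straight off *sql.Rows, so no schema is declared here.
	rows, err := db.Query("SELECT * FROM people")
	if err != nil {
		panic(err)
	}
	defer rows.Close()

	// Write the result set, headers included, to a local CSV file.
	if err := sqltocsv.WriteFile("people.csv", rows); err != nil {
		panic(err)
	}
}
```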
I don't have the code anymore, but I also tried gorm and realized I would need to define a struct/schema for it.