
Postgres fast load #44

@sunilred


Hi Gerald,

First off, this is a great tool. Thanks very much.

My use case is to upload CSV files into an AWS Postgres database for data analytics (Windows environment). Your tool works great; however, it is a bit slow at my data volumes. I get files with a million rows that need to be uploaded to Postgres on a daily basis.

For testing, I loaded 20,000 rows: it took 2-3 minutes, while the same load via psql's COPY took 2 seconds.

I was wondering if there is a way to tap into Postgres's COPY for a faster upload?

I am assuming your process, in order to be generic, uses INSERT statements behind the scenes?
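
For reference, here is a rough sketch of what a COPY-based load could look like from Python with psycopg2 (connection details, table name, and file name are placeholders, and I'm assuming a CSV with a header row):

```python
import psycopg2

# Placeholder connection details -- adjust for your environment.
conn = psycopg2.connect(host="localhost", dbname="analytics",
                        user="loader", password="secret")

with conn, conn.cursor() as cur, open("data.csv", "r") as f:
    # copy_expert streams the file through Postgres's COPY protocol,
    # avoiding a round trip per row as with individual INSERTs.
    cur.copy_expert(
        "COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)",
        f,
    )

conn.close()
```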

Also, I was wondering if we can add a few more options to "csv2db" that I think other users may find useful - and hopefully you see value in them.

Additional useful options for "csv2db load"

  1. Truncate table and load (optional)
  2. Append to target table (default)

Additional useful option for "csv2db generate"

  3. Create the table in the target database (keeping the current behavior, printing the CREATE TABLE script, as the default; see the sketch below)
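
To illustrate what I mean, here is a minimal sketch of the proposed behaviors using the same psycopg2 pattern as above (the table name and column definitions are made up, not csv2db internals):

```python
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="analytics",
                        user="loader", password="secret")

with conn, conn.cursor() as cur:
    # Option 3: run the generated CREATE TABLE instead of only printing it.
    cur.execute("""CREATE TABLE IF NOT EXISTS my_table (
                       col1 VARCHAR(4000),
                       col2 VARCHAR(4000))""")
    # Option 1: empty the target before loading; skipping this step
    # gives the append behavior of option 2.
    cur.execute("TRUNCATE TABLE my_table")
    with open("data.csv", "r") as f:
        cur.copy_expert(
            "COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)", f)

conn.close()
```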

Let me know what you think.

Thanks
Sunil
