For high-performance data loads on Postgres (I'm not sure how much of this carries over to Teradata), I've had by far the best results just generating raw `COPY ... FROM STDIN` statements to stdout or a file and piping them straight into psql. If you use the pg_dump utility, you can take a look at the SQL it generates and do the same.
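As a rough illustration of that approach, here is a minimal sketch that emits a COPY statement plus tab-separated rows in the same shape pg_dump produces for data-only dumps. The `users` table and the `copy_sql` helper are hypothetical names for this example:

```python
import io

def copy_sql(table, rows, columns):
    """Emit a COPY ... FROM STDIN statement followed by tab-separated
    data rows, terminated the way pg_dump terminates COPY data."""
    buf = io.StringIO()
    buf.write("COPY {} ({}) FROM stdin;\n".format(table, ", ".join(columns)))
    for row in rows:
        # \N is PostgreSQL's default NULL marker in COPY text format
        buf.write("\t".join(r"\N" if v is None else str(v) for v in row))
        buf.write("\n")
    buf.write("\\.\n")  # end-of-data marker
    return buf.getvalue()

rows = [(1, "alice"), (2, None)]
print(copy_sql("users", rows, ["id", "name"]), end="")
```

Writing a script like this to stdout lets you pipe the load directly, e.g. `python gen_load.py | psql mydb`, without materializing an intermediate file.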
Hope this helps.
—erik
On Oct 5, 2015, at 8:09 PM, simk0020 <[address removed]> wrote:
I am trying to upload a data file to Teradata using python.
For smaller dataframes, I have used for loops with insert statements. However, this cannot be the proper way to do it.
I have heard of pd.to_sql function but have struggled with the sqlalchemy engine.
Looking for thoughts on the best way to do this.
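For reference, the to_sql path mentioned above can be sketched roughly like this. This uses an in-memory SQLite engine purely as a stand-in; a real Teradata connection would need the appropriate SQLAlchemy dialect and connection URL instead:

```python
import pandas as pd
from sqlalchemy import create_engine

# Stand-in engine; swap the URL for your actual database/dialect.
engine = create_engine("sqlite://")

df = pd.DataFrame({"id": [1, 2], "name": ["alice", "bob"]})

# Write the whole frame in one call instead of row-by-row INSERTs.
# "users" is a hypothetical table name for this example.
df.to_sql("users", engine, if_exists="replace", index=False)

print(pd.read_sql("SELECT count(*) FROM users", engine).iloc[0, 0])  # → 2
```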
Really appreciate the comments.
thanks,
-Vivek
--
This message was sent by Meetup on behalf of simk0020 from DC Python.