I'm trying to find the best, fastest way to add a large number of rows to my server at a time. The way I'm doing it now involves a loop that inserts each row individually, which takes a long time. Is there a better way?
If this is SQL Server, you can't really beat bcp, though DTS is nearly as fast and a lot more robust. For "batch jobs" that extract or import data from other sources (files, Excel, another database), DTS is the recommended approach and is worth investigating.
I'm using Visual Basic .NET to interface with a PostgreSQL server. I'm not sure those terms apply to anything but MS SQL Server. I'm trying to add about 200,000 rows to this PostgreSQL server at the click of a button, without having to wait all day for the operation to finish. Any ideas?
If you must use code - e.g., if PostgreSQL doesn't support any kind of bulk copy - then I'd recommend dynamic SQL, batching up multiple inserts (if that's what you need) into one chunk of SQL. Finding a balance is up to you, but I generally start with 50 records or so. Something like:
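A minimal sketch of that batching idea (in Python for brevity, since the thread's code is VB.NET; the function name `build_insert_batches` and the table/column names are hypothetical, and the literal quoting here is deliberately simplistic):

```python
def build_insert_batches(rows, table, columns, batch_size=50):
    """Group rows into chunks and render each chunk as one SQL string
    of multiple INSERT statements, so each round trip to the server
    carries many rows instead of one."""

    def quote(value):
        # Minimal literal quoting for the sketch: double up single
        # quotes in strings. Real code should use proper escaping for
        # its driver, since dynamic SQL built from raw input is an
        # injection risk.
        if isinstance(value, str):
            return "'" + value.replace("'", "''") + "'"
        return str(value)

    col_list = ", ".join(columns)
    batches = []
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        statements = [
            "INSERT INTO {0} ({1}) VALUES ({2});".format(
                table, col_list, ", ".join(quote(v) for v in row))
            for row in chunk
        ]
        # One string per batch; execute each with a single command call.
        batches.append("\n".join(statements))
    return batches
```

Each returned string is then sent as one command (in VB.NET terms, set it as the command text and call `ExecuteNonQuery` once per batch), so 200,000 rows at 50 per batch means 4,000 round trips instead of 200,000.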
For "batch jobs" where speed is a concern, you should generally avoid the "good programming" practice of using Command objects and Parameter objects, as they're a lot slower than you probably want.
For 200,000 rows it's a toss-up - that's kind of an in-between size, so you may not need to sacrifice "good programming" for speed. Maintenance is also important.
If you decide not to use dynamic SQL, maybe let us know some more details:
* what kind of code you've got so far
* how big each row is (how many columns and their relative sizes)
* what kind of machine you're running against (the server)
* what kind of data source you have (where do you get your data)
* what you expect in terms of speed (how long it should take to run)