Ah, code! First the questions you asked about...
You're already doing "substring" parsing (you use Mid(), a VB6 leftover). Some people define a line from their flat file in terms of regular expressions (using the Regex object) and let that object do the parsing. It's very readable code (maybe not the regular expression itself), but slower than pulling chunks out of the string with Mid.
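For example, a fixed-width line can be picked apart with named groups. The field widths below are made up - plug in your real template:
Code:
Imports System.Text.RegularExpressions

' Hypothetical layout: 8-char date, 10-char account, 5-char location
Dim lineParser As New Regex("^(?<AssmDate>.{8})(?<Acc>.{10})(?<Loc>.{5})")
Dim m As Match = lineParser.Match(input)
If m.Success Then
    Dim acc As String = m.Groups("Acc").Value   ' pull each field out by name
End If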
BCP is the fastest way to get data into the database. bcp is a command-line tool for SQL Server that takes an input file and a format file (which defines the fields, the delimiter, whether strings are quoted, etc.) and sucks the file into the table - VERY fast.
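A typical import call looks something like this (the database, file, and server names are placeholders; -f points at the format file, -T uses a trusted connection):
Code:
bcp MyDb.dbo.badges in badges.txt -f badges.fmt -S MyServer -T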
DTS is the next best thing. In fact, if you design the DTS package correctly, it will use bcp behind the scenes. It also allows you to "transform" the data by sticking a piece of VBScript in the middle. In other words, it will generate a default script for you along the lines of:
Code:
Function Main()
    DTSDestination("Field1") = DTSSource("Field1")
    DTSDestination("Field2") = DTSSource("Field2")
    Main = DTSTransformStat_OK
End Function
In there you can write your own logic, including subroutines and more. DTS is also very fast and lets you tweak the reading/writing. For small to medium complexity, I'd use DTS over a custom EXE.
Since you're using VS2005, you may be able to use the new SqlBulkCopy object. I admit I haven't had a chance to try it out yet, but it supposedly lets you write your VB.NET application the way you want and still take advantage of bulk copy speeds.
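Going by the documentation, the basic shape is something like this - untested on my end, and it assumes you've already loaded the file into a DataTable named badgeTable:
Code:
Imports System.Data.SqlClient

Using bulk As New SqlBulkCopy(connectionString)
    bulk.DestinationTableName = "badges"
    bulk.WriteToServer(badgeTable)   ' pushes the whole DataTable in via bulk copy
End Using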
Now, for your sample code:
One of the first things I would do is change the string handling to use the StringBuilder object. Every time you concatenate a string, .NET must allocate space for the new string and then copy the old string plus the new piece to the new location. Here's your code with some "dings" for each string copy:
Code:
' Ding 1 for the ReadLine() method
If Len(input) > 15 Then
    ' Ding 2
    SQL = "INSERT INTO badges (..."
    For X = 1 To UBound(FileTmplt)
        If DTypeArr(X) = 1 Then
            ' Ding 3
            TmpVal = Mid(input, strCnt, FileTmplt(X))
            ' Ding 4 (maybe more, inside of RepSngQut)
            TmpVal = ADO.RepSngQut(TmpVal)
            ' Ding 5 (I think only one ding, although multiple strings are concatenated)
            SQL = SQL & "'" & TmpVal & "', "
        Else
            ' ...
        End If
    Next X
    ' Ding 6
    SQL = SQL & "'" & FileSrch & "')"
    ADO.ActionSQL(SQL, Cnn)
    WriteToLog(FileLog.Current, "Reading Line")
End If
As the example shows, there are 6 dings (string copies) for each line in the file. If your file is 10 meg (relatively small for some jobs), that's like reading and copying around 60 meg of data.
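Here's a rough sketch of the same loop body using StringBuilder, reusing your variable names. Append() mostly writes into the builder's buffer in place (it only reallocates when it outgrows it, and it returns the builder so calls chain), so the one full string copy is the final ToString():
Code:
Dim sb As New System.Text.StringBuilder(256)
sb.Append("INSERT INTO badges (...")
For X = 1 To UBound(FileTmplt)
    If DTypeArr(X) = 1 Then
        TmpVal = ADO.RepSngQut(Mid(input, strCnt, FileTmplt(X)))
        sb.Append("'").Append(TmpVal).Append("', ")
    End If
Next X
sb.Append("'").Append(FileSrch).Append("')")
ADO.ActionSQL(sb.ToString(), Cnn)   ' one string allocation at the end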
Before making any changes, I'd do some profiling to see how things look before/after. Some changes may help, some may not.
As you pointed out, you may also try using SqlParameter objects - you only have to build the collection of parameters one time, then reset the values on each loop.
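Something like this, using a couple of column names from your INSERT - the types and sizes are guesses, so match them to your table (as a bonus, parameters handle single quotes for you, so RepSngQut goes away):
Code:
Imports System.Data
Imports System.Data.SqlClient

' Build once, before the loop (assumes Cnn is an open SqlConnection):
Dim cmd As New SqlCommand("INSERT INTO badges (Acc, Loc) VALUES (@Acc, @Loc)", Cnn)
cmd.Parameters.Add("@Acc", SqlDbType.VarChar, 20)
cmd.Parameters.Add("@Loc", SqlDbType.VarChar, 20)

' Inside the loop, just reset the values and execute:
cmd.Parameters("@Acc").Value = Mid(input, strCnt, FileTmplt(X))
cmd.Parameters("@Loc").Value = locValue   ' hypothetical variable for the Loc field
cmd.ExecuteNonQuery()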
You may also try good old do-it-yourself batching. That is, save up 20 to 50 of the final SQL statements in one string and execute them all at once. So the string will look like:
Code:
INSERT INTO badges (AssmDate, Acc, Loc, WearNum...
INSERT INTO badges (AssmDate, Acc, Loc, WearNum...
INSERT INTO badges (AssmDate, Acc, Loc, WearNum...
Then one call to ADO.ActionSQL(SQL, Cnn) commits them all. It may take a little extra work for logging, to handle a failed batch (like a separate routine that loops through the big string and commits each INSERT one at a time in case of bad data).
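The plumbing might look like this (the batch size of 50 is arbitrary - experiment with it):
Code:
Dim batch As New System.Text.StringBuilder()
Dim pending As Integer = 0

' Inside the read loop, instead of executing each INSERT:
batch.AppendLine(SQL)
pending += 1
If pending >= 50 Then
    ADO.ActionSQL(batch.ToString(), Cnn)
    batch.Length = 0   ' clears the builder (no Clear() in .NET 2.0)
    pending = 0
End If

' After the loop, flush whatever is left over:
If pending > 0 Then ADO.ActionSQL(batch.ToString(), Cnn)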
-ner