alanchinese
Well-known member
- Joined
- Jan 12, 2005
- Messages
- 58
I have a table that contains 60M to 100M rows of data:
column name    data type    length
key            char         42
desc           varchar      200
ref            int          4
All three columns are indexed, and key is the primary key.
I am doing the following operation:
if (!DataIsInTable(key))
InsertIntoTable(key);
It's fairly slow when the number of rows reaches 100M.
I wonder if there is some advanced method to improve its performance?
What happens if I don't check whether the data is already in the table? Would letting the insert fail on a duplicate key be faster than selecting first?
Thanks...
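For what it's worth, the check-then-insert pattern above costs two index probes per key (one for the SELECT, one for the INSERT), and many engines can do the whole thing in one statement by letting the unique index on the primary key reject duplicates. Here is a minimal sketch using Python with SQLite purely for illustration, since the original database engine isn't stated; MySQL's INSERT IGNORE or SQL Server's IGNORE_DUP_KEY index option play a similar role:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE t (key CHAR(42) PRIMARY KEY, desc VARCHAR(200), ref INT)"
)

def check_then_insert(key):
    # Two operations: a SELECT probe of the index, then an INSERT.
    cur = conn.execute("SELECT 1 FROM t WHERE key = ?", (key,))
    if cur.fetchone() is None:
        conn.execute("INSERT INTO t (key) VALUES (?)", (key,))

def insert_or_ignore(key):
    # One operation: the unique index on the primary key rejects
    # duplicates, and OR IGNORE suppresses the resulting error.
    conn.execute("INSERT OR IGNORE INTO t (key) VALUES (?)", (key,))

check_then_insert("a" * 42)
insert_or_ignore("a" * 42)   # duplicate key: silently skipped
insert_or_ignore("b" * 42)   # new key: inserted

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 2
```

Whether this actually wins at 100M rows depends on the engine and how it handles rejected inserts, so it's worth benchmarking both forms against the real table.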