FcabralJ
I have a list of 21 YYYYMM values (year-month pairs) that I loop through in parallel, executing a query for each one and loading the result into a DataTable, which is then written out to a CSV file, as you can see below.
The DataTable "Load" call alone takes 30 minutes to execute, so I am looking for suggestions on how to improve it.
lstFilterValues.AsParallel().ForAll(filterValue =>
{
    using (SAConnection _conn = DB_Connection.Connect(_psfAcronym))
    {
        var queryWithFilter = mainQuery.Replace("@FILTER", filterValue.ToString());
        using (IDataReader _dataReader = new SACommand(queryWithFilter, _conn).ExecuteReader())
        {
            var _dataTable = new DataTable();
            _dataTable.Load(_dataReader);
            List<T> lstTableRows = new List<T>(TransformToObject.DataTableToList<T>(_dataTable));
            using (var sw = new StreamWriter(filePath))
            using (var csv = new CsvWriter(sw))
            {
                csv.Configuration.Delimiter = UniversalVariables.csvDelimiter.ToString();
                csv.Configuration.HasHeaderRecord = true;
                csv.Configuration.UseNewObjectForNullReferenceMembers = true;
                csv.WriteHeader<T>();
                csv.NextRecord();
                csv.WriteRecords(lstTableRows);
                sw.Flush();
            }
        }
        SybaseIQ_Connection.Disconnect(_conn);
    }
});
The database table contains 21 million rows, so each iteration should handle roughly 1 million of them.
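One way to attack the 30-minute `Load` is to skip the `DataTable` and the intermediate `List<T>` entirely and stream rows straight from the `IDataReader` to the CSV writer, so no iteration ever materializes its ~1 million rows in memory. This is a sketch, not the original author's code: it assumes the same pre-3.x CsvHelper API used above (the `CsvWriter(TextWriter)` constructor), and it writes raw reader columns instead of going through the `T` mapping.

```csharp
// Sketch: stream IDataReader rows directly to CSV, avoiding DataTable.Load
// and the intermediate List<T>. Column handling here is illustrative.
using (IDataReader reader = new SACommand(queryWithFilter, _conn).ExecuteReader())
using (var sw = new StreamWriter(filePath))
using (var csv = new CsvWriter(sw))
{
    csv.Configuration.Delimiter = UniversalVariables.csvDelimiter.ToString();

    // Header row, taken from the reader's schema.
    for (int i = 0; i < reader.FieldCount; i++)
        csv.WriteField(reader.GetName(i));
    csv.NextRecord();

    // Single pass over the result set; each row is written and discarded,
    // so memory stays flat regardless of how many rows the query returns.
    while (reader.Read())
    {
        for (int i = 0; i < reader.FieldCount; i++)
            csv.WriteField(reader.IsDBNull(i) ? string.Empty : reader.GetValue(i));
        csv.NextRecord();
    }
}
```

If the `T` mapping is still required, the same idea applies: pass `csv.WriteRecords` a lazily built `IEnumerable<T>` (e.g. a `yield return` iterator over the reader) instead of a fully populated `List<T>`. Note also that `filePath` must be unique per `filterValue`, otherwise the parallel iterations will overwrite each other's output.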